The large language model of the OpenGPT-X research project is now available for download on Hugging Face: “Teuken-7B” has been trained from scratch in all 24 official languages of the European Union (EU) and contains seven billion parameters. Researchers and companies can leverage this commercially usable open source model for their own artificial intelligence (AI) applications. Funded by the German Federal Ministry of Economic Affairs and Climate Action (BMWK), the OpenGPT-X consortium – led by the Fraunhofer Institutes for Intelligent Analysis and Information Systems IAIS and for Integrated Circuits IIS – has developed a large language model that is open source and has a distinctly European perspective.

[…]

The path to using Teuken-7B

Interested developers from academia or industry can download Teuken-7B free of charge from Hugging Face and work with it in their own development environment. The model has already been optimized for chat through “instruction tuning”. Instruction tuning adapts large language models so that they correctly understand user instructions, which is important when the models are used in practice – for example in a chat application.

Teuken-7B is freely available in two versions: one for research-only purposes and an “Apache 2.0” licensed version that can be used by companies for both research and commercial purposes and integrated into their own AI applications. The performance of the two models is roughly comparable, but some of the datasets used for instruction tuning preclude commercial use and were therefore not used in the Apache 2.0 version.

Download options and model cards can be found at the following link: https://huggingface.co/openGPT-X
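
For orientation, the snippet below is a minimal sketch of how such an instruction-tuned model is typically loaded from Hugging Face with the transformers library. The exact repository id, role name, and chat-template details are assumptions drawn from the openGPT-X organization page rather than from this article; the linked model cards remain the authoritative usage instructions.

```python
# Minimal sketch: loading an instruct variant of Teuken-7B with Hugging Face
# transformers. The repository id and chat formatting below are assumptions;
# consult the model card at https://huggingface.co/openGPT-X for the
# authoritative instructions.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "openGPT-X/Teuken-7B-instruct-commercial-v0.4"  # assumed repo id

# The model ships custom code on the Hub, hence trust_remote_code=True.
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    trust_remote_code=True,
    device_map="auto",   # place weights on GPU(s) if available
    torch_dtype="auto",  # use the dtype stored in the checkpoint
)

# Build a chat-style prompt via the tokenizer's chat template; the role label
# and template arguments may differ per the model card.
messages = [{"role": "user", "content": "What is Teuken-7B?"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Swapping in the research-only repository instead of the Apache 2.0 one would follow the same pattern; only the repository id changes.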

  • LesserAbe@lemmy.world · 29 days ago

    Sounds pretty good. From what I understand, because of the massive compute required to train LLMs, the only groups that can afford to do it are large corporations or states. And since this model is open source, maybe there will be less energy used by a bunch of different organizations training their own models.