DoctorGLM, a Chinese medical consultation model based on ChatGLM-6B (GitHub: xionghonglin/DoctorGLM).
Reading: 37 2023-07-23
SoulChat, a large language model for mental-health dialogue in Chinese (GitHub: scutcyr/SoulChat). Based on the six traits of active health, proactivity, prevention, precision, personalization, co-construction and sharing, and self-discipline, the team has open-sourced ProActiveHealthGPT, a large-model base for active health in Chinese living spaces. It includes: BianQue, a living-space health model instruction-fine-tuned on millions of Chinese health dialogues; and SoulChat, a mental-health model fine-tuned on a combination of long-form Chinese instructions and millions of multi-turn empathetic counseling dialogues. The hope is that ProActiveHealthGPT will help accelerate academic research on, and application of, large models in chronic-disease management, psychological counseling, and other active-health areas. This entry covers SoulChat, the mental-health model.
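Fine-tuning on multi-turn counseling dialogues generally means serializing each conversation into a single training sample. The sketch below shows what that flattening step might look like; the role tags and record layout are illustrative assumptions, not SoulChat's actual data schema.

```python
# Sketch: serializing a multi-turn counseling dialogue into one training
# sample. The role labels and dialogue content below are illustrative
# assumptions, not the actual SoulChat data format.

def serialize_dialogue(turns):
    """Flatten alternating user/assistant turns into a single prompt string."""
    lines = []
    for turn in turns:
        role = "User" if turn["role"] == "user" else "Assistant"
        lines.append(f"{role}: {turn['content']}")
    return "\n".join(lines)

dialogue = [
    {"role": "user", "content": "I've been feeling anxious before exams."},
    {"role": "assistant", "content": "That sounds stressful. What worries you most?"},
    {"role": "user", "content": "That I'll forget everything I studied."},
]

sample = serialize_dialogue(dialogue)
print(sample.splitlines()[0])  # → "User: I've been feeling anxious before exams."
```

In a real pipeline each serialized conversation would then be tokenized and used as one supervised example, with the loss typically restricted to the assistant turns.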
Reading: 194 2023-07-23
QiZhenGPT is an open-source Chinese medical language model. It uses the QiZhen medical knowledge base to construct a Chinese medical instruction dataset, then instruction-fine-tunes the Chinese-LLaMA-Plus-7B, CaMA-13B, and ChatGLM-6B models on it, significantly improving their effectiveness in Chinese medical scenarios. An evaluation dataset for drug-knowledge Q&A was released first; planned next steps are to improve Q&A quality on diseases, surgery, and medical tests, and to expand into applications such as doctor-patient Q&A and automatic medical-record generation.
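Building an instruction dataset from a knowledge base usually means templating each entry into an instruction/input/output record. A minimal sketch, using the common Alpaca-style field names as an assumption (QiZhenGPT's actual schema may differ):

```python
# Sketch: turning a medical knowledge-base fact into an instruction-tuning
# record. The JSON field names follow the widely used Alpaca convention,
# assumed here for illustration; the drug example is invented.
import json

def make_instruction_record(drug, indication):
    return {
        "instruction": f"What is {drug} used for?",
        "input": "",
        "output": f"{drug} is indicated for {indication}.",
    }

record = make_instruction_record("ibuprofen", "mild to moderate pain and fever")
line = json.dumps(record, ensure_ascii=False)  # one JSONL line of the dataset
```

Repeating this over every knowledge-base entry, with several question templates per entry, yields the kind of instruction corpus the fine-tuning step consumes.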
Reading: 136 2023-07-22
BianQue, based on the six characteristics of proactivity, prevention, precision, personalization, co construction and sharing, and self-discipline of active health, has opened up the ProActiveHealthGPT, a large-scale model base for active health in Chinese living spaces, which includes: BianQue, a life space health model fine tuned with millions of Chinese health dialogue data instructions SoulChat, a mental health model fine tuned through a combination of Chinese long text instructions and multiple rounds of empathy dialogue data in the field of millions of psychological counseling We hope that the ProActiveHealthGPT, a large-scale model for active health in living spaces, can help accelerate the research and application of large-scale models in the fields of chronic diseases, psychological counseling, and other active health areas in academia. This project is BianQue, a large-scale model of living space health.
Reading: 116 2023-07-23
Vite (French for "fast", pronounced /vit/, like "veet") is a next-generation front-end build tool that significantly improves the front-end development experience. It consists of two main parts: a development server that serves source files over native ES modules, with rich built-in features such as extremely fast Hot Module Replacement (HMR); and a build command that bundles your code with Rollup, preconfigured to output highly optimized static assets for production. Vite provides sensible defaults out of the box, while its Plugin API and JavaScript API offer high extensibility with full typing support. Key features:
- Instant server start: serves native ESM files with no bundling required.
- Lightning-fast HMR: hot module replacement that stays fast regardless of app size.
- Rich features: out-of-the-box support for TypeScript, JSX, CSS, and more.
- Optimized build: preconfigured Rollup build with optional "multi-page app" and "library" modes.
- Universal plugins: a Rollup-superset plugin interface shared between dev and build.
- Fully typed APIs: flexible programmatic APIs with full TypeScript typings.
Reading: 88 2022-01-02
The Sketch Chinese site is a community that introduces the Mac design tool Sketch in Chinese. It shares the latest Chinese-language Sketch manual and usage tips.
Reading: 69 2019-03-27
Under the wave of ChatGPT, the rapid expansion of artificial intelligence has provided fertile ground for LLMs to spread. Healthcare, education, and finance have each gradually developed their own domain models, but the legal field has seen no comparable progress. To promote open research on applying LLMs to law and other vertical domains, this project open-sources a Chinese legal model and offers a workable approach to combining an LLM with a knowledge base in legal scenarios. The currently open-sourced ChatLaw legal models for academic use are based on Jiangziya-13B and Anima-33B. We constructed dialogue data from a large volume of original texts: legal news, legal forums, statutes, judicial interpretations, legal consultations, legal exam questions, and judgment documents. The Jiangziya-13B-based model is the first version; thanks to Jiang Ziya's strong Chinese-language ability and our strict data cleaning and augmentation pipeline, it performs well on logically simple legal tasks but often struggles with complex legal reasoning. We subsequently added training data and trained ChatLaw-33B on top of Anima-33B, which shows a clear improvement in logical reasoning; this suggests that large-parameter Chinese LLMs are crucial. Our technical report is on arXiv: ChatLaw. The version trained on commercially usable base models will serve as the internal integrated version for our future products and is not open-sourced. You can try out the open-source version of the model here.
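Combining an LLM with a legal knowledge base typically means retrieving relevant statutes first and prepending them to the prompt. A minimal sketch of that retrieve-then-prompt pattern; the tiny statute list and the word-overlap scoring are toy assumptions for illustration, not ChatLaw's actual retrieval method:

```python
# Sketch: naive keyword retrieval over a toy statute list, then prompt
# assembly. The statutes and the scoring rule are illustrative assumptions;
# a production system would use embeddings or a proper search index.

STATUTES = [
    "Contract Law Art. 52: a contract is void if it violates mandatory legal provisions.",
    "Labor Law Art. 36: working hours shall not exceed eight hours per day.",
    "Road Traffic Law Art. 22: drivers shall obey traffic signals.",
]

def retrieve(question, statutes, k=1):
    """Rank statutes by the number of lowercase words shared with the question."""
    q_words = set(question.lower().split())
    scored = sorted(
        statutes,
        key=lambda s: len(q_words & set(s.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(question, statutes):
    context = "\n".join(retrieve(question, statutes))
    return f"Relevant law:\n{context}\n\nQuestion: {question}\nAnswer:"

prompt = build_prompt("How many working hours per day does labor law allow?", STATUTES)
```

The assembled prompt grounds the model's answer in retrieved text, which is the main way retrieval reduces hallucinated citations in legal Q&A.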
Reading: 55 2023-07-23
The goal of this project is to promote the development of the open source community for Chinese dialogue big models, with the vision of becoming an LLM Engine that can help everyone. Compared to how to do well in pre training of large language models, BELLE focuses more on how to help everyone obtain their own language model with the best possible instruction expression ability on the basis of open-source pre training of large language models, and reduce the research and application threshold of large language models, especially Chinese large language models. To this end, the BELLE project will continue to open up instruction training data, related models, training code, application scenarios, etc., and will also continuously evaluate the impact of different training data, training algorithms, etc. on model performance. BELLE has been optimized for Chinese, and model tuning only uses data produced by ChatGPT (excluding any other data).
Reading: 54 2023-07-22
What is Chineasy? By ShaoLan ("Xue Xiaolan", transliterated). As the daughter of a calligrapher who grew up in Taiwan, my earliest and most treasured memory is of my mother showing me the beauty, shape, and form of Chinese characters. Ever since, I have been fascinated by the structure of this incredible language. To outsiders, though, it can seem as impenetrable as the Great Wall of China. For years I wondered whether I could find a way to break down that wall, so that anyone who wants to understand and appreciate this charming language could do so with their own eyes.

Twelve years ago I moved to the United Kingdom and enrolled at the University of Cambridge. Two years later I had a degree and two children. As I started my new life, I noticed how popular Chinese culture had become and how eager people were to embrace it, yet they struggled with the language; even my own children found it daunting. That set me thinking about whether a new, simpler way of reading Chinese would be useful. By day I am an Internet entrepreneur and venture capitalist; by night I have been busy building a system that makes learning Chinese simple.

Methodology: I invented a way to learn the written language by understanding the basic shapes and meanings of characters. Once learners can recognize a small set of core characters, they can combine them to pick up dozens more with ease. By repeating this "recognize and combine" process, students can learn hundreds of Chinese characters in a remarkably short time.

A scholar of Chinese may master an astonishing 20,000 characters, but basic literacy requires only about 1,000. Most interestingly, learning just the 200 most common characters is enough to read roughly 40% of popular Chinese literature: enough for many practical uses, such as reading road signs and restaurant menus, and getting the gist of newspapers and websites. Over the past two years I have deconstructed the nearly 2,000 most commonly used Chinese characters and identified the top 100 "building blocks". These building blocks come in different shapes and sizes; to form additional characters, you simply combine two or three (sometimes four or five) of them. For non-Chinese readers, this is a genuinely powerful tool for grasping the meaning of Chinese characters quickly and effectively. Because it uses vivid, playful visuals, the memorization process is fun, fast, and lasting. I hope you will find learning Chinese characters as delightful as the methodology behind Chineasy.
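The leverage of the "recognize and combine" idea can be seen with a quick count: even a small inventory of building blocks admits a very large number of 2- and 3-component combinations. Only a small fraction of these correspond to real characters, so the figures below are loose upper bounds, not a claim about actual Chinese.

```python
# Back-of-the-envelope count of how far ~100 building blocks can stretch:
# the number of possible unordered 2- and 3-component combinations. Real
# characters use only a small fraction of these, so treat both as upper bounds.
from math import comb

components = 100
pairs = comb(components, 2)    # unordered 2-component combinations
triples = comb(components, 3)  # unordered 3-component combinations

print(pairs)    # 4950
print(triples)  # 161700
```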
Reading: 45 2019-03-27
The National Gallery of Victoria (NGV) is the most-visited art gallery in Australia. Its collection is extremely rich, including many world-renowned artworks, and is exhibited across two venues: NGV International and NGV Australia at the Ian Potter Centre. NGV International houses the gallery's international collection and is open daily from 10:00 to 17:00. NGV Australia showcases Australian art, including works by Indigenous and Torres Strait Islander peoples, and is likewise open daily from 10:00 to 17:00. Admission to the NGV collection is free, though special exhibitions may charge an entry fee.
Reading: 31 2019-05-20
The first open-source 33B Chinese language model trained with QLoRA. The AI community has always been open: today's progress in AI is inseparable from many important open-source projects, openly shared papers, and open data and code. We believe the future of AI will be open as well, and we hope to contribute to the open-source community. Why does a 33B model matter, and is QLoRA a game changer? Previously, most open-source models that could be fine-tuned were relatively small, 7B or 13B. Although they can score well on some simple chatbot evaluation sets after fine-tuning, their limited scale means the core reasoning ability of the LLM is relatively weak, which is why many of these small models behave like toys in real application scenarios. As discussed in this work, chatbot evaluation sets are relatively simple; on the complex logical-reasoning and mathematical problems that truly test a model's ability, there is still a clear gap between small and large models. We therefore believe QLoRA is very important, potentially a game changer: its optimizations make it possible, for the first time, to fine-tune a 33B-scale model at democratized, low cost and put it to wide use. A 33B model can both leverage the strong reasoning ability of large-scale models and be flexibly fine-tuned on private, domain-specific business data to improve control over the LLM.
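The "democratized, low-cost fine-tuning" claim comes down mostly to memory. A rough back-of-the-envelope comparison of the weight memory for a 33B-parameter model at 16-bit versus 4-bit precision; real usage also needs activations, the KV cache, optimizer state, and the LoRA adapters, so these figures are lower bounds, not exact requirements:

```python
# Rough weight-memory estimate for a 33B-parameter model at different
# precisions. Only the base weights are counted; activations, KV cache,
# optimizer state, and LoRA adapters are ignored, so these are lower bounds.

params = 33e9

def weight_gib(num_params, bits_per_param):
    """Memory for the weights alone, in GiB."""
    return num_params * bits_per_param / 8 / 1024**3

fp16 = weight_gib(params, 16)  # full 16-bit weights
nf4 = weight_gib(params, 4)    # 4-bit quantized weights, as in QLoRA

print(round(fp16, 1))  # ~61.5 GiB
print(round(nf4, 1))   # ~15.4 GiB
```

The 4x reduction in base-weight memory is what moves a 33B fine-tune from multi-GPU territory into the range of a single large consumer or workstation GPU, with the trainable LoRA adapters kept in higher precision on top.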
Reading: 21 2023-07-23