News provided by aibrews.com
- Technology Innovation Institute in Abu Dhabi has released Falcon 180B, a large language model with 180 billion parameters trained on 3.5 trillion tokens. It is currently the largest openly available model and rivals proprietary models like PaLM-2. Falcon 180B is 2.5 times larger than Llama 2 and was trained with 4x more compute. It is available for both research and commercial use [Details].
- Meta AI released Belebele, a first-of-its-kind multilingual reading comprehension dataset spanning 122 language variants, enabling direct comparison of how well models understand different languages [Details].
- Meta AI has published Code Llama’s research paper with more information on training, evaluation results and safety [Paper].
- Open Interpreter, an open-source, locally running implementation of OpenAI's Code Interpreter, hit 6K+ stars within 24 hours of its v0.1 launch and has now crossed 10K stars; see the usage sketch after this list [Details].
- Retool, the low-code platform for building internal tools, has launched Retool AI for faster building of custom AI apps, workflows, and chatbots, visually or with code [Details].
- Expo City Dubai announced the Artificial Intelligence Film Festival (AIFF), with submissions open until December 1 [Details].
- OpenAI’s first developer conference will be in San Francisco on November 6, with registration for in-person attendance opening soon [Details].
- Researchers have developed an AI model that predicts the odor profile of a molecule based solely on its structure [Details].
- AI Grant announced the startups accepted into its second batch [Details].
- IBM announced a new family of generative AI foundation models named Granite. Granite.13b.instruct and Granite.13b.chat use a decoder architecture and are more efficient than larger models, fitting onto a single V100-32GB GPU [Details].
- Microsoft has filed a patent for an AI-powered backpack with smart sensors. Wearers would benefit from AI-enhanced object identification and analysis, interaction with nearby devices, and contextual insights [Details].
- Lomonosov Moscow State University (MSU) has launched its new supercomputer, MSU-270, with a peak computational power of 400 'AI' PetaFLOPS. It will be used for AI and high-performance computing (HPC) applications, including training large AI models [Details].
- Anthropic has launched a paid version of its Claude chatbot in the US and UK, priced at $20/£18 per month [Details].
- X’s privacy policy confirms it will use public data to train AI models [Details].
- Hugging Face has launched Training Cluster as a Service, letting users train their large models on Hugging Face's GPU cluster [Link].
- Zoom's generative AI assistant, Zoom IQ, has been rebranded as Zoom AI Companion with new capabilities [Details].
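
For anyone who wants to try Open Interpreter (mentioned above), here is a minimal sketch based on the project's README around the v0.1 launch; the package name and `chat` API below come from that README and may have changed in later releases:

```python
# Install first:  pip install open-interpreter
# By default this uses OpenAI models, so an API key is expected
# (set the OPENAI_API_KEY environment variable), per the v0.1 README.
import interpreter

# Opens a ChatGPT-like session in which the model writes code and,
# after asking for your confirmation, runs it locally on your machine.
interpreter.chat("Plot the normalized stock prices of AAPL and MSFT.")
```

Running the `interpreter` command with no arguments starts the same interactive session directly from the terminal.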
🔦 Weekly Spotlight
- Inside Elon Musk's Struggle for the Future of AI by Walter Isaacson [Link].
- How generative AI helped train Amazon One to recognize your palm [Link].
- Why Meta’s Yann LeCun isn’t buying the AI doomer narrative [Link].
- TinyLlama project: an open endeavor to pretrain a 1.1B Llama model on 3 trillion tokens [Link].
---
Welcome to the r/artificial weekly megathread. This is where you can discuss Artificial Intelligence: talk about new models, recent news, ask questions, make predictions, and chat about other related topics.
Click here for discussion starters for this thread or for a separate post.
Self-promo is allowed in these weekly discussions. If you want to make a separate post, please read and follow the rules or you will be banned.