- Elon Musk’s xAI has open-sourced the base code of its Grok AI model, but without any training code. The company describes it on GitHub as a “314 billion parameter Mixture-of-Experts model.”[1]
- Apple announces MM1, a family of multimodal LLMs with up to 30B parameters that are state-of-the-art on pre-training metrics and perform competitively after fine-tuning.[2]
- Microsoft tells European regulators Google has an edge in generative AI.[3]
- Nvidia’s Jensen Huang, Fed’s Powell may rock markets this week.[4]
Sources:
[1] https://techcrunch.com/2024/03/17/xai-open-sources-base-model-of-grok-but-without-any-training-code/
[4] https://www.thestreet.com/investing/nvidia-jensen-huang-fed-powell-rock-markets-this-week