⚡️ Spatially Aware

PLUS: Google to invest $2B in OpenAI rival Anthropic

It’s Monday. Teaching models spatial awareness remains a major challenge in vision-language research. Researchers have now released a groundbreaking MLLM that combines referring and grounding capabilities. Let’s dive right in.

Today’s Highlights:

  • Google to invest $2B in OpenAI rival Anthropic

  • Bard can now respond in real time

  • UN assembles board to address AI governance

DEEP DIVE

Columbia & Apple Unveil Groundbreaking MLLM Ferret

Researchers from Columbia University and Apple have unveiled Ferret, an advanced multimodal LLM (MLLM) that unifies referring and grounding, two capabilities essential to vision-language learning.

Handling regions in many forms, such as strokes, scribbles, or intricate polygons, is no small task. Using a spatial-aware visual sampler, Ferret extracts visual features for regions of any shape, letting its input freely mix free-form text with referred regions.
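To make that concrete, here’s a minimal sketch of region-based feature pooling in PyTorch. It is not Ferret’s actual sampler (the paper describes a more elaborate point-sampling and aggregation scheme); the function name and tensor shapes below are our own illustrative assumptions.

    import torch

    def sample_region_features(feature_map, mask, num_points=32):
        """Pool visual features from an arbitrarily shaped region.

        feature_map: (C, H, W) tensor of image features
        mask: (H, W) binary tensor marking the referred region
              (a stroke, scribble, or polygon rasterized to pixels)
        """
        ys, xs = torch.nonzero(mask, as_tuple=True)        # pixels inside the region
        idx = torch.randint(len(ys), (min(num_points, len(ys)),))
        sampled = feature_map[:, ys[idx], xs[idx]]         # (C, n) features at sampled points
        return sampled.mean(dim=1)                         # average into one region token

    # Toy usage: pool 256-dim features under a scribble-shaped mask
    feats = torch.randn(256, 24, 24)
    mask = torch.zeros(24, 24)
    mask[5:8, 3:20] = 1
    region_token = sample_region_features(feats, mask)     # shape: (256,)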

But what makes Ferret hold its ground?

  • The researchers made the model open-vocabulary and instruction-following, keeping it practical for real user applications.

  • A new dataset, Ground-and-Refer Instruction-Tuning (GRIT), mixes location and text in both input and output for training (see the sketch after this list).

  • Using ChatGPT/GPT-4, they also collected refer-and-ground instruction-tuning conversations and performed spatially aware negative data mining to boost model robustness.
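As promised above, here’s a hypothetical example of what mixing location and text looks like in a GRIT-style training sample. The released dataset’s exact schema may differ; the image name and coordinates are made up for illustration.

    # Hypothetical GRIT-style sample (illustrative only; the released
    # dataset's exact schema may differ). Box coordinates are embedded
    # directly in the text on both the input and output sides.
    sample = {
        "image": "kitchen_042.jpg",
        # the input *refers* to a region by its bounding box...
        "question": "What is the object [120, 85, 210, 160] used for?",
        # ...and the output *grounds* its answer with a box of its own
        "answer": "That is a kettle [118, 82, 214, 163], used to boil water.",
    }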

To measure how Ferret fares, the researchers built the ‘Ferret-Bench’ evaluation, which covers three tasks: Referring Description, Referring Reasoning, and Grounding in Conversation. Pitted against competing MLLMs, Ferret outperformed them by an average of 20.4% and even showed reduced object hallucination.

PUNCHLINES

Money Talks: Google has reportedly agreed to back OpenAI rival Anthropic with $2B.

Rear-View Future: Nextbase is revolutionizing driver safety with its new AI-based smart dash cam.

Artificial Lifesaver: Sunak announces focus on treating cancer and dementia with £100 million AI fund.

Bugging Out: Google expands its bug bounty program to include generative AI vulnerabilities.

TLDR

Bard now responds in real time: Google's chatbot Bard has been updated to provide real-time responses. It now also includes a 'Skip response' feature, which allows users to cut off the chatbot mid-sentence if unsatisfied with its response. Other updates include the ability to view uploaded images in shared conversations and advanced email summarization capabilities.

New robot arm sensor mimics human skin: Researchers from the University of British Columbia and Honda's research institute have developed a soft sensor for robot arms that mimics the texture and sensitivity of human skin. This advanced, stretchable sensor can detect various force types, allowing for more sophisticated and sensitive movements.

UN assembles board to address AI governance: The United Nations has established a 39-member AI advisory board. Drawn from academia, government, and industry, the board will provide recommendations on the international governance of AI. Its initial meeting took place on Oct 27, and it’s expected to present its recommendations by summer 2024.

AI model YOLOv5 to spot urban decay: Stanford and Notre Dame researchers have trained a YOLOv5-based detection model to identify signs of urban decay, such as potholes, graffiti, and neglected buildings. The model performed well in densely populated areas of cities like South Bend, Indiana; Mexico City; and San Francisco, giving timely and accurate data to public and nongovernmental organizations.
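If you want to poke at the underlying detector yourself, YOLOv5 is openly available through PyTorch Hub. The sketch below loads the stock pretrained model; the researchers fine-tuned theirs on urban-decay classes, and the image path here is just a placeholder.

    import torch

    # Load the small pretrained YOLOv5 checkpoint from the Ultralytics hub.
    # This stock model detects generic COCO classes; the study's version was
    # fine-tuned to spot potholes, graffiti, and neglected buildings.
    model = torch.hub.load("ultralytics/yolov5", "yolov5s", pretrained=True)

    results = model("street_scene.jpg")   # placeholder image path
    results.print()                       # summary of detected classes and confidences
    boxes = results.pandas().xyxy[0]      # detections as a pandas DataFrame
    print(boxes[["name", "confidence"]])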

TRENDING TOOLS

🔍 Julius: Analyze your data and files using AI

💼 Resolve AI: Instantly resolve up to 50% of your support tickets with AI

💻 Olle: Implement ChatGPT on your Mac for seamless interactions

📥 Spoke: An AI-powered priority inbox for product builders

⚖️ Genie AI: Your personal AI legal assistant to simplify complex tasks

That’s all for today—if you have any questions or something interesting to share, please reply to this email. We’d love to hear from you!

P.S. If you want to sign up for the Supercharged newsletter or share it with a friend, you can find us here.
