The company has formed partnerships with 13 universities in 9 countries. These organizations have compiled over 2,200 hours of first-person video from 700 participants’ daily lives.
Madrid (Portaltic/EP).- Facebook announced Thursday its long-term project Ego4D, an artificial intelligence (AI) initiative that seeks to train machines to perceive the world as people do in everyday life, using first-person data.
The challenge Facebook's new project takes on is getting AI to understand and interact with the world from an egocentric, first-person perspective, as humans do. Current algorithms, such as those for computer vision, are mostly trained on third-person video, which complicates the task.
Kristen Grauman, Facebook's research director, stated that the next generation of artificial intelligence systems will have to learn from a completely new kind of data: videos that show the world from the center of the action rather than from the sidelines, the company explained in a statement.
This research will help lay the foundations so that AI-powered virtual assistants, augmented reality glasses, and robots can help people in their daily lives, whether by finding lost keys, preparing meals, or recalling exactly what happened at a past moment.
This library of audiovisual resources will increase the data available to the scientific community; measured in hours of footage, it is 20 times larger than any comparable library.
In collaboration with the university consortium and Facebook Reality Labs Research (FRL Research), Facebook AI has defined five benchmarks based on first-person experiences, intended to drive the development of real-world applications for future personal assistants.
These pillars are episodic memory, forecasting, hand-object manipulation, audiovisual memory, and social interaction, all areas in which artificial intelligence still cannot match human first-person perception.
Facebook hopes these five pillars will allow artificial intelligence to interact with people not only in the real world but also in the metaverse, which encompasses augmented and virtual reality.
Grauman says that artificial intelligence will "not only begin to better understand its environment, but could also be personalized at an individual level; it could know your favorite coffee or guide you through the itinerary of your next family trip."
The Ego4D university consortium will release the data this year for authorized research uses only, and Facebook will invite researchers to take part in a challenge to teach machines to understand the world from a first-person perspective.