Body Labs
Type: Subsidiary
Location: Manhattan, New York
Industry: Artificial intelligence
Founders: Michael J. Black, William J. O'Farrell, Eric Rachlin, Alex Weiss
Parent: Amazon.com
Body Labs is a Manhattan-based software company founded in 2013. It provides human-aware artificial intelligence that understands the 3D body shape and motion of people from RGB photos or videos.[1]
In October 2017, the company was acquired by Amazon.[2]
Body Labs was founded by Michael J. Black, William J. O'Farrell, Eric Rachlin, and Alex Weiss, who met through their work at Brown University and the Max Planck Institute for Intelligent Systems.[3]
In 2002, Black was researching how to create a statistical model of the human body. While Black was teaching a course on computer vision at Brown University, the Virginia State Police contacted him about a robbery and murder at a 7-Eleven. The police wanted to use computer vision to identify the suspect in a surveillance video. With a statistical body model, Black's group could corroborate some of the evidence in the case, such as confirming the suspect's height.[4]
On November 13, 2014, Body Labs announced $2.2 million in seed funding led by FirstMark Capital, with participation from New York Angels and existing investors.[5]
On November 3, 2015, Body Labs announced $11 million in Series A funding led by Intel Capital, with additional investors including FirstMark Capital, Max-Planck-Innovation GmbH, Osage University Partners, Catalus Capital and the company founders.[6]
On March 3, 2015, Body Labs launched BodyKit, a collection of APIs and embeddable components for integrating the human body into apps and tools.[7][8]
On July 20, 2016, Body Labs launched Body Labs Blue, an API and embeddable web interface that takes physical measurements and predicts additional digital measurements to help with custom clothing creation.[9][10]
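Body Labs did not publicly document the Blue interface in detail; the following is a minimal sketch of how a measurement-prediction service of this kind could be called, assuming a generic REST API. The endpoint URL, field names, and authentication scheme are placeholders for illustration, not Body Labs' actual interface.

```python
# Hypothetical sketch of a measurement-prediction request. The endpoint,
# field names, and credentials are illustrative only and do not reflect
# the documented Body Labs Blue API.
import requests

API_URL = "https://api.example.com/bodylabs-blue/predict"  # placeholder URL
API_KEY = "YOUR_API_KEY"  # placeholder credential

# A small set of physical measurements supplied by the user (centimeters/kilograms).
known_measurements = {
    "height": 178.0,
    "weight_kg": 75.0,
    "waist_girth": 84.0,
    "chest_girth": 99.0,
}

response = requests.post(
    API_URL,
    json={"gender": "male", "measurements": known_measurements},
    headers={"Authorization": f"Bearer {API_KEY}"},
    timeout=30,
)
response.raise_for_status()

# The service would return a fuller set of predicted digital measurements
# (e.g. inseam, sleeve length, hip girth) for use in custom clothing patterns.
predicted = response.json()
for name, value in predicted.get("measurements", {}).items():
    print(f"{name}: {value:.1f} cm")
```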
On October 5, 2016, Body Labs launched Body Labs Red, an API for automatically processing 3D scans into a full 3D body model. Additionally, Body Labs announced a partnership with 3dMD to process their 3D scans.[11][12]
On February 15, 2017, Body Labs released Mosh on the App Store, an Apple iOS app that predicts the 3D human pose and shape of a subject and renders 3D effects on them.[13][14]
On June 1, 2017, Body Labs launched SOMA, software that uses artificial intelligence to predict 3D human shape and motion from RGB photos or video.[15][16]
On July 21, 2017, Body Labs launched the SOMA Shape API for 3D model and measurement prediction, which allows third-party apps to connect to the SOMA backend.[17]
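The Shape API is no longer publicly available; the sketch below shows how a third-party app could, in principle, submit an RGB photo to a shape-prediction backend and read back a 3D model reference and predicted measurements. The host, routes, request fields, and response fields are assumptions for illustration, not Body Labs' actual SOMA Shape API.

```python
# Hypothetical sketch of a third-party client calling a shape-prediction
# backend. All routes and field names are illustrative placeholders.
import requests

BASE_URL = "https://api.example.com/soma"  # placeholder host
HEADERS = {"Authorization": "Bearer YOUR_API_KEY"}  # placeholder credential

# Submit an RGB photo of the subject for 3D shape and measurement prediction.
with open("subject_front.jpg", "rb") as photo:
    job = requests.post(
        f"{BASE_URL}/shape-predictions",
        headers=HEADERS,
        files={"photo": photo},
        data={"height_cm": 178, "weight_kg": 75},
        timeout=60,
    )
job.raise_for_status()
result = job.json()

# A response might contain a mesh reference plus predicted body measurements.
print("3D model URL:", result.get("mesh_url"))
for name, value in result.get("measurements", {}).items():
    print(f"{name}: {value} cm")
```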