According to a recent report by The Verge, Lightship will be “built around the parts necessary to stitch together the digital and the real world.” Niantic CEO John Hanke told the publication that the platform will let mobile apps identify whether a user is pointing a camera towards the sky or water.
It will let users map different surfaces and measure the depth of an environment in real time. It will also let users place a virtual object behind a physical one, presumably by leveraging the company’s Reality Blending technology.
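The general technique behind hiding a virtual object behind a physical one is a per-pixel depth test: the virtual layer is drawn only where it is closer to the camera than the real scene. Here is a minimal sketch of that idea in plain numpy; the function name and toy data are hypothetical and this is not the Lightship ARDK API, just an illustration of how a real-world depth map enables occlusion.

```python
import numpy as np

def composite_with_occlusion(camera_rgb, real_depth, virtual_rgb, virtual_depth):
    """Draw the virtual layer only where it is nearer to the camera than
    the real scene, so physical objects in front of it occlude it."""
    virtual_in_front = virtual_depth < real_depth  # per-pixel depth test
    out = camera_rgb.copy()
    out[virtual_in_front] = virtual_rgb[virtual_in_front]
    return out

# Toy 2x2 frame: the real surface sits at 2 m everywhere; the virtual
# object is at 1 m in the left column (visible) and 3 m in the right
# column (hidden behind the real surface).
camera = np.zeros((2, 2, 3), dtype=np.uint8)       # black camera feed
real_depth = np.full((2, 2), 2.0)
virtual = np.full((2, 2, 3), 255, dtype=np.uint8)  # white virtual layer
virtual_depth = np.array([[1.0, 3.0],
                          [1.0, 3.0]])

result = composite_with_occlusion(camera, real_depth, virtual, virtual_depth)
# Left column shows the virtual object; right column keeps the camera feed.
```

A production renderer does this test on the GPU with a depth buffer, but the principle is the same: accurate real-world depth is what makes convincing occlusion possible.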
The Lightship platform has been in development for quite some time, but the company has now opened it up for developers. Although the software toolkit itself is free, developers must pay for the feature that enables shared AR experiences across multiple devices simultaneously.
Furthermore, the Niantic CEO says the company is already planning a major Lightship update for next year. It will add a feature called a “visual positioning system” aimed specifically at AR glasses. With this new system, AR glasses with displays will be able to determine the user’s position in the real world and keep a virtual object anchored to a specific real-world location. This could prove hugely helpful for the AR glasses Niantic is developing in partnership with Qualcomm.
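The core idea of such anchoring is that the anchor's coordinates are fixed in a shared world frame, while the device pose reported by the positioning system changes every frame; the renderer simply re-expresses the anchor in the camera's frame. The following is a minimal sketch of that transform, with made-up numbers and a hypothetical function name, not Niantic's actual system.

```python
import numpy as np

def world_to_camera(anchor_world, cam_rotation, cam_position):
    """Express a fixed world-space anchor point in the camera's frame.
    cam_rotation (3x3) and cam_position (3,) are the device pose that a
    positioning system would report each frame; the anchor itself never
    moves in world space."""
    return cam_rotation.T @ (anchor_world - cam_position)

# Anchor fixed in the world frame: 2 m to the right, 5 m ahead.
anchor = np.array([2.0, 0.0, 5.0])

# Frame 1: camera at the world origin, no rotation.
p1 = world_to_camera(anchor, np.eye(3), np.zeros(3))   # [2, 0, 5]

# Frame 2: the user has walked 1 m forward along +z, so the anchor
# appears 1 m closer while staying put in the world.
p2 = world_to_camera(anchor, np.eye(3), np.array([0.0, 0.0, 1.0]))  # [2, 0, 4]
```

What a visual positioning system adds on top of this arithmetic is the hard part: estimating `cam_rotation` and `cam_position` accurately from camera imagery so the anchor does not drift.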
Moreover, Hanke believes Lightship gives developers a feature-packed platform, since it supports both iOS and Android. And with companies like Meta and Microsoft working to turn the idea of the metaverse into reality, Lightship should remain a relevant platform for developers in the future.