Build unparalleled augmented reality experiences for hundreds of millions of users on iOS and iPadOS, the biggest AR platforms in the world. With powerful frameworks like ARKit and RealityKit, and creative tools like Reality Composer and Reality Converter, it’s never been easier to bring your ideas to life in AR. ARKit 3.5 uses the new LiDAR Scanner and depth-sensing system on iPad Pro to make AR experiences more realistic than ever before. The new Scene Geometry API lets you capture a 3D representation of the world in real time, enabling object occlusion and real-world physics for virtual objects. All experiences enabled by ARKit automatically benefit from new instant AR placement, and improved Motion Capture and People Occlusion.
Integrate iOS device camera and motion features to produce augmented reality experiences in your app or game. Augmented reality (AR) describes user experiences that add 2D or 3D elements to the live view from a device's camera in a way that makes those elements appear to inhabit the real world. ARKit combines device motion tracking, camera scene capture, advanced scene processing, and display conveniences to simplify the task of building an AR experience. You can create many kinds of AR experiences with these technologies using the front or rear camera of an iOS device.
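To make that concrete, here is a minimal sketch of running a world-tracking AR session in a RealityKit ARView; the view controller and outlet names are illustrative, not part of ARKit.

```swift
import UIKit
import ARKit
import RealityKit

// A minimal sketch: run a world-tracking AR session in a RealityKit ARView.
// The class and outlet names are illustrative, not ARKit API.
class ARViewController: UIViewController {
    @IBOutlet var arView: ARView!

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        // Track the device's position and orientation in the real world,
        // and look for horizontal and vertical planes as anchoring surfaces.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal, .vertical]
        arView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        arView.session.pause()
    }
}
```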
* Scene Geometry: Create a topological map of your space with labels identifying floors, walls, ceilings, windows, doors, and seats. This deep understanding of the real world unlocks object occlusion and real-world physics for virtual objects, and also gives you more information to power your AR workflows (see the scene-reconstruction sketch after this list).
* Instant AR: The LiDAR Scanner on iPad Pro enables extremely fast plane detection, allowing instant placement of AR objects in the real world without scanning. Instant AR placement is automatically enabled on iPad Pro for all apps built with ARKit, without any code changes (a tap-to-place sketch follows the list).
* Improved Motion Capture and People Occlusion: With ARKit 3.5 on iPad Pro, depth estimation in People Occlusion and height estimation in Motion Capture are more accurate. Both improvements apply automatically on iPad Pro to all apps built with ARKit, without any code changes.
* Motion Capture: Capture the motion of a person in real time with a single camera. By understanding body position and movement as a series of joints and bones, you can use motion and poses as an input to the AR experience, placing people at the center of AR (see the body-tracking sketch below).
* Collaborative Sessions: With live collaborative sessions between multiple people, you can build a collaborative world map, making it faster for you to develop AR experiences and for users to get into shared AR experiences like multiplayer games (see the collaboration sketch below).
* People Occlusion: AR content realistically passes behind and in front of people in the real world, making AR experiences more immersive while also enabling green screen-style effects in almost any environment (see the occlusion sketch below).
* Multiple Face Tracking: ARKit Face Tracking tracks up to three faces at once, using the TrueDepth camera on iPhone X, iPhone XS, iPhone XS Max, iPhone XR, and iPad Pro to power front-facing camera experiences like Memoji and Snapchat (see the face-tracking sketch below).
* Simultaneous Front and Back Camera: You can simultaneously use face and world tracking on the front and back cameras, opening up new possibilities. For example, users can interact with AR content in the back camera view using just their face (see the combined-tracking sketch below).
* Additional Improvements: Detect up to 100 images at a time and get an automatic estimate of the physical size of the object in each image. 3D object detection is more robust, as objects are better recognized in complex environments. And machine learning is now used to detect planes in the environment even faster (see the image-detection sketch below).
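Each feature above maps onto a small surface of ARKit API. First, Scene Geometry: a minimal sketch, assuming a LiDAR-equipped iPad Pro, of enabling classified scene reconstruction and observing the mesh anchors ARKit produces. The class name is illustrative.

```swift
import ARKit

// A minimal sketch of Scene Geometry; the class name is illustrative.
final class SceneGeometryDriver: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let configuration = ARWorldTrackingConfiguration()
        // Scene reconstruction requires the LiDAR Scanner; check support first.
        if ARWorldTrackingConfiguration.supportsSceneReconstruction(.meshWithClassification) {
            configuration.sceneReconstruction = .meshWithClassification
        }
        session.delegate = self
        session.run(configuration)
    }

    // ARKit streams the reconstructed mesh as ARMeshAnchor updates; each face
    // carries a classification such as wall, floor, ceiling, window, door, or seat.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let meshAnchor as ARMeshAnchor in anchors {
            let mesh = meshAnchor.geometry
            // mesh.classification labels each face of the reconstructed mesh.
            _ = mesh.classification
        }
    }
}
```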
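Instant AR placement itself needs no new code, but placement is typically driven by ARKit's raycasting API; here is a sketch of tap-to-place against a horizontal plane, where a LiDAR iPad Pro simply returns usable results sooner.

```swift
import ARKit
import RealityKit

// A minimal sketch of tap-to-place with ARKit raycasting. On a LiDAR iPad Pro
// the same call succeeds almost immediately, with no code changes.
func placeObject(at screenPoint: CGPoint, in arView: ARView) {
    // Cast a ray from the screen point against estimated or detected planes.
    let results = arView.raycast(from: screenPoint,
                                 allowing: .estimatedPlane,
                                 alignment: .horizontal)
    if let first = results.first {
        // Anchor content at the hit location; the anchor name is illustrative.
        let anchor = ARAnchor(name: "placedObject", transform: first.worldTransform)
        arView.session.add(anchor: anchor)
    }
}
```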
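For Motion Capture, a sketch of running body tracking and reading one joint from the tracked skeleton; the class name and the choice of the head joint are illustrative.

```swift
import ARKit

// A minimal sketch of body tracking; the class name is illustrative.
final class BodyTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Body tracking is only available on supported devices.
        guard ARBodyTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARBodyTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let bodyAnchor as ARBodyAnchor in anchors {
            let skeleton = bodyAnchor.skeleton
            // Query a joint's transform relative to the body's root, e.g. the head.
            if let headTransform = skeleton.modelTransform(for: .head) {
                _ = headTransform // drive your AR experience from the pose
            }
        }
    }
}
```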
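For Collaborative Sessions, a sketch of opting in and relaying collaboration data; the networking transport between peers (for example, MultipeerConnectivity) is assumed and not shown.

```swift
import ARKit

// A minimal sketch of a collaborative session; the class name is illustrative,
// and send(_:) over your own network layer is assumed, not shown.
final class CollaborationDriver: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let configuration = ARWorldTrackingConfiguration()
        configuration.isCollaborationEnabled = true
        session.delegate = self
        session.run(configuration)
    }

    // ARKit periodically emits collaboration data to share with peers.
    func session(_ session: ARSession, didOutputCollaborationData data: ARSession.CollaborationData) {
        // Send `data` to the other participants over your transport of choice.
    }

    // When a peer's data arrives, feed it back into the local session.
    func receive(_ data: ARSession.CollaborationData) {
        session.update(with: data)
    }
}
```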
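People Occlusion is a frame-semantics opt-in on the session configuration; a minimal sketch, guarded by the support check since the feature needs recent hardware.

```swift
import ARKit

// A minimal sketch of opting in to People Occlusion on a world-tracking session.
func enablePeopleOcclusion(on session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()
    // Person segmentation with depth lets virtual content pass both behind
    // and in front of people; not every device supports it, so check first.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
        configuration.frameSemantics.insert(.personSegmentationWithDepth)
    }
    session.run(configuration)
}
```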
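Multiple face tracking is a one-property opt-in on the face-tracking configuration; this sketch asks for as many simultaneous faces as the device supports.

```swift
import ARKit

// A minimal sketch of tracking several faces at once with the TrueDepth camera.
func runFaceTracking(on session: ARSession) {
    guard ARFaceTrackingConfiguration.isSupported else { return }
    let configuration = ARFaceTrackingConfiguration()
    // Ask for as many simultaneous faces as the platform supports (up to three).
    configuration.maximumNumberOfTrackedFaces =
        ARFaceTrackingConfiguration.supportedNumberOfTrackedFaces
    session.run(configuration)
}
```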
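Combining the cameras is also a configuration flag; a sketch of adding face anchors from the TrueDepth camera to a world-tracking session. The reverse direction, adding world tracking to a face-tracking session, uses ARFaceTrackingConfiguration's isWorldTrackingEnabled.

```swift
import ARKit

// A minimal sketch of simultaneous front and back camera tracking.
func runSimultaneousTracking(on session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsUserFaceTracking {
        // Face anchors from the front camera arrive alongside world tracking,
        // so the user's expression can drive content in the back camera view.
        configuration.userFaceTrackingEnabled = true
    }
    session.run(configuration)
}
```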
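Finally, a sketch of image detection with automatic physical-size estimation; the asset catalog group name "AR Resources" is an assumption for illustration.

```swift
import ARKit

// A minimal sketch of image detection with automatic size estimation.
func runImageDetection(on session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()
    // "AR Resources" is an assumed asset catalog group of reference images.
    if let images = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources",
                                                     bundle: nil) {
        configuration.detectionImages = images
        // Let ARKit estimate each detected image's physical size automatically.
        configuration.automaticImageScaleEstimationEnabled = true
    }
    session.run(configuration)
}
```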