Apple Maps is essential to future innovation, from autonomous vehicles to AR experiences, so it’s noteworthy that Apple’s own Indoor Mapping Data Format (IMDF) has been adopted as a community standard for indoor positioning services.
What is IMDF?
IMDF has been accepted as a community standard by the Open Geospatial Consortium in response to proposals from a cross-industry group that includes Apple, Google, Autodesk, and Ordnance Survey.
It enables a venue, organization, or industry to create fully customized indoor maps using industry-standard tools. The maps use existing Wi-Fi networks to enable GPS-level location accuracy, so visitors can navigate interior spaces using their devices.
Scott Simmons, chief standards officer at the Open Geospatial Consortium, had this to say:
“With IMDF now part of the OGC Standards Baseline, we look forward to deeper integration with other geospatial Standards to address location needs everywhere.”
Apple developed IMDF to provide indoor maps for venues, so Apple Maps users have an easier time navigating spaces. The company says it “offers a mobile-friendly, compact, human-readable, and highly extensible data model for any indoor space, providing a basis for orientation, navigation, and discovery.”
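Since IMDF is built on GeoJSON, each indoor element (a venue, level, unit, opening, and so on) is expressed as a typed GeoJSON Feature. The sketch below constructs a minimal IMDF-style "unit" feature in Python; the field names follow the published IMDF model, but the IDs and values are invented for illustration, and this is not a validated IMDF archive.

```python
import json

# A minimal, hypothetical IMDF-style "unit" feature. IMDF extends
# GeoJSON: every indoor element is a Feature with a "feature_type"
# member and structured properties. IDs and coordinates here are
# made up for illustration only.
unit = {
    "type": "Feature",
    "id": "11111111-1111-1111-1111-111111111111",
    "feature_type": "unit",
    "geometry": {
        "type": "Polygon",
        "coordinates": [[
            [-122.009, 37.333],
            [-122.008, 37.333],
            [-122.008, 37.334],
            [-122.009, 37.334],
            [-122.009, 37.333],  # ring closes on its first point
        ]],
    },
    "properties": {
        "category": "room",
        "level_id": "22222222-2222-2222-2222-222222222222",
        "name": {"en": "Conference Room A"},  # localized display name
    },
}

# Serialize and parse back, as a consumer reading unit.geojson would.
round_tripped = json.loads(json.dumps(unit))
```

Because the container is plain GeoJSON, any standard geospatial toolchain can read the geometry even before it understands the IMDF-specific properties, which is much of the point of standardizing on it.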
That deeper integration between geospatial standards is widely seen as vital to the evolution of autonomous machines.
The tech may be used to solve real challenges, such as providing government agencies with indoor maps for emergency use; allowing hospitals to provide mapping guidance for patients, doctors, and visitors; and helping airports create maps that partners can customize — shops or airlines could brand the data in their own apps while maintaining accuracy.
The great indoors
Maps are information — not just to help you navigate between points A and B, but to teach intelligent machines to navigate spaces contextually and safely. Stop to consider it and you’ll realize that maps are not fixed.
What they depict may be static, but life takes place in these spaces. Accidents happen, lanes get blocked, crowds form, weather and traffic conditions change. To be really useful, mapping data must be contextualized by circumstance, which IMDF does to some extent: at present, it determines location using Wi-Fi.
We can see how these things might evolve with a quick glance at autonomous vehicle development. Most companies in this space draw on multiple data inputs — mapping is one; Lidar and video footage another (71% of AVs tested in China use Lidar) — along with gyroscopic and proximity sensors, and more.
As cars hit the road and robots hit the mall, intelligent machines will use mapping data contextualized by sensor and shared data to generate accurate navigational decisions in real time.
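To make that concrete, here is a toy sketch — in no way Apple’s implementation — of how a machine might blend an absolute but noisy map-based position fix (say, from Wi-Fi positioning against an indoor map) with smooth but drifting odometry, using a simple complementary filter. All names and weights are invented for illustration.

```python
# Toy complementary filter: blend a noisy absolute map fix with
# dead-reckoned odometry. Alpha weights the smooth odometry estimate;
# (1 - alpha) pulls it back toward the absolute map fix so drift
# cannot accumulate unchecked.
def fuse(map_fix: float, odometry: float, alpha: float = 0.8) -> float:
    return alpha * odometry + (1 - alpha) * map_fix

# Simulate a robot moving down a corridor at 1 m per step.
estimate = 0.0
for step in range(1, 6):
    odometry = estimate + 1.0     # dead reckoning from the last estimate
    map_fix = float(step) + 0.3   # noisy absolute fix near the true position
    estimate = fuse(map_fix, odometry)

print(round(estimate, 2))  # → 5.2, close to the true position of 5.0
```

Real systems use far more sophisticated estimators (Kalman and particle filters over many sensor channels), but the principle is the same: standardized map data anchors the estimate, and live sensor data contextualizes it moment to moment.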
We know Apple is working on this. Blind BBC reporter Lucy Edwards has already shown us how an iPhone 12 with Lidar helped her walk along a street. The sensor data contextualizes the map space, and the standardization of all this information may help make for a more accessible planet, as well as safe and reliable autonomous transport.
Whether mapping is part of the matrix of information directing a connected car through Las Vegas or a delivery robot between locations in the mall, the systems must be better at what they do than the humans they replace.
Consistent access to shared, standardized mapping data, along with the ability to contextualize all available additional information, is critical. As a well-known Googler used to say (while refusing to publish the company’s search algorithms), “Open is better than closed.”
Therefore, I think the adoption of Apple’s indoor mapping format as a standard will one day be seen as foundational to new business opportunities and voice-first interface design. If you can figure out how to guide people and machines effectively around indoor spaces, outdoor spaces can’t be too far behind.