The COVID-19 pandemic has forced every industry to improvise and innovate to build the agility and resilience needed to get through the crisis. Now, a key Apple partner is using telepresence robots as it leads construction of Apple’s new headquarters in London.
Apple’s telepresence robot
Apple’s billion-dollar Californian headquarters is somewhat under-used at the moment, but the company continues to invest in new centers worldwide as its global growth continues. Apple in the UK has taken over a large part of the globally recognized Battersea Power Station for its new UK headquarters.
These plush new offices are currently under construction by Foster + Partners, who also designed the U.S. headquarters and this week introduced us to the on-site robot. (Technically, it’s not Apple’s robot, but it’s hard not to imagine the company making use of the data it generates.)
Manufactured by Boston Dynamics, the Spot robot is being used to oversee construction of the Battersea project, and it is also an excellent example of several other technologies Apple is currently developing for mass-market use.
What Spot does and why it matters
As you can see, it is being used to scan the construction site and monitor progress. I imagine the Apple project managers responsible for the build use this to visit the site virtually and keep an eye on what is going on.
You could see this as the ultimate Zoom conversation, one in which the participants can direct where the camera looks in a remote location.
It’s also of importance to the construction industry as it enables:
- Enhanced social distancing.
- Regular and accurate construction tracking and measurement.
- Early identification of errors, since more people can access the data than can possibly visit the site at one time.
- A monitoring system for any decisions that may deviate from the design.
- Efficiency in logistics and procurement as problems may be more easily identified.
This is a great illustration of the efficiencies that can be unleashed through digital transformation.
LiDAR, AI and Industry 4.0
Spot is also a useful illustration of what can be done with machine vision and LiDAR sensors. The architects have said the system is gathering imaging data to build a 4D “digital twin” of the site, updated as construction progresses; Spot performs its scans weekly.
I think of this as working the way an iPhone 12 Pro Max with LiDAR does. On those high-end iPhones, apps such as Primer and Canvas use LiDAR to build extremely accurate indoor maps to help you redesign internal spaces, while Wayfair lets you place objects in a room and [AR]T Museum lets you place classic works of art.
This use of LiDAR sensors has obvious applications in interior design and architecture.
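To get a feel for the scan-and-compare workflow described above, here is a minimal sketch of the core idea: check a fresh scan’s point cloud against a design model and flag any points that deviate beyond a tolerance. The point coordinates, tolerance value, and function names below are illustrative assumptions, not Foster + Partners’ actual pipeline, which works on far denser data.

```python
import math

def nearest_distance(point, cloud):
    """Euclidean distance from `point` to its nearest neighbour in `cloud`."""
    return min(math.dist(point, other) for other in cloud)

def flag_deviations(scan, design, tolerance=0.05):
    """Return scan points farther than `tolerance` (metres) from any design point."""
    return [p for p in scan if nearest_distance(p, design) > tolerance]

# Hypothetical design model and weekly scan (coordinates in metres).
design_points = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
weekly_scan = [(0.01, 0.0, 0.0),   # within tolerance of the design
               (1.0, 0.0, 0.3)]    # 0.3 m off the design: flagged

print(flag_deviations(weekly_scan, design_points))  # → [(1.0, 0.0, 0.3)]
```

A production system would use registered point clouds with millions of points and a spatial index rather than a brute-force nearest-neighbour search, but the principle, measure the gap between as-built and as-designed, is the same.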
To infinity and beyond?
“The ability of Spot to repeatedly and effortlessly complete routine scans in an ever-changing environment was invaluable, not only in terms of the consistency but also the large amount of high-quality data collected,” said Foster + Partners partner Martha Tsigkari in a statement.
“Our scans can ensure that very quick and accurate changes to the newly designed system could be made to accommodate the differences captured by the scans – all in a matter of days. This could result in savings both in terms of time and money.”
The idea of using automation to boost the architectural process is a microcosm of the wider use of converged technologies across multiple industries. The Boston Dynamics robot is capable of handling uneven terrain and navigating on-site areas too unsafe for humans to explore.
(Of course, humans wearing LiDAR-equipped Apple glasses will also be able to share incredibly realistic experiences as they explore danger, from mountain climbing to deep-sea diving.)
This makes systems of this kind a good fit for emergency support, disaster relief, mining, sea exploration and all the many other places in which we are seeing increasing use of drones and other automata today. Highly accurate machine vision tools (such as LiDAR) just open up even more possibilities. In space exploration, research is already taking place that may lead to use of LiDAR sensors to create highly accurate maps of the moon.
Meanwhile, back on planet Earth, Apple continues to combine machine vision intelligence, LiDAR and augmented reality to provide increasingly realistic AR, including support for tactile experiences.