This project took place as a summer internship in July and August 2021. The project investigated how physical and digital prototypes can be combined and synchronised to enable new capabilities using immersive-reality technologies and approaches.
Here, we used virtual- and augmented-reality tracking technologies to synchronise an early-stage physical prototype with a realistic digital version. Once synchronised, users can manipulate the physical prototype directly and see their actions replicated on the digital version, giving them a way to interact with an as-final prototype right from the earliest stages of the design process.
We used a technique known as a digital mirror, where the digital model is synchronised on a screen as if it were a reflection of the physical prototype held in front of it. Users can move the physical object while seeing themselves on the screen holding and using the as-final version. They can also reconfigure the physical version with new geometry, handles, or parts, and see the changes replicated in as-final form in the reflection, while also physically feeling the difference in the object in their hand.
The technology we’ve developed speeds up the process of learning from prototypes, gives non-technical users a more accessible way of providing feedback earlier in the design process, and allows designers to get more information from lower-fidelity physical prototypes without sacrificing quality.
How it works - Low-fidelity to High-fidelity digital prototyping mirror
All files used to create this system are available on GitHub.
When the Unity program recognises pre-programmed AR image targets, it loads the corresponding parts into the digital space. 3D tracking is then handled by the VR system, with an HTC tracker attached to the prototype. A high-fidelity digital model is then displayed on a screen, with the real-time tracking and dimensional similarity giving the user the perception of a functional ‘AR Mirror’.
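At its core, the mirror effect amounts to reflecting the tracked pose across the screen plane before rendering it. The actual system is a Unity project, but the underlying transform can be sketched in Python (assuming, purely for illustration, that the screen lies on the z = 0 plane of the tracking space):

```python
def mirror_position(position, screen_z=0.0):
    """Reflect a tracked position across the screen plane (z = screen_z),
    so the digital model appears as the prototype's reflection."""
    x, y, z = position
    return (x, y, 2.0 * screen_z - z)

def mirror_rotation(qx, qy, qz, qw):
    """Mirror an orientation quaternion across the z = 0 plane.

    Reflecting a rotation R as S R S (with S = diag(1, 1, -1)) negates
    the x and y components of its quaternion."""
    return (-qx, -qy, qz, qw)
```

Applied every frame to the tracker's pose, this keeps the on-screen model moving like a reflection of the physical prototype.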
For this example, a 3D-printed drill has been split into three parts. There are AR targets on the head and base, while the handle acts as an anchor and is always included. Users can swap each component for alternative versions to test their impact on product usability, such as different ergonomic grips, or different masses in the head and base to test mass distribution.
Model functionality: All pieces attach to one another with basic interference ‘dovetail’ joints. The AR image targets are attached with tape, and the VR tracker is mounted on a screw glued to the 3D-printed part. The base of the drill is hollow and can be split in two using snap joints, allowing its weight to be customised at will.
Digital functionality: Sections of the drill can be customised, with keyboard inputs generating random colours in each section. The program reveals parts of the drill when specific image targets are shown; currently, there are targets for the base and the main body of the drill.
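The recolouring logic is simple key-to-section dispatch. A rough Python sketch of the idea (the key bindings here are hypothetical, not taken from the actual Unity scripts):

```python
import random

# Hypothetical key bindings; the current build has image targets
# for the base and the main body of the drill.
SECTIONS = {"b": "base", "m": "main body"}

def recolour(key, rng=None):
    """Return (section, random RGB colour) for the section bound to
    `key`, or None if the key is unmapped."""
    rng = rng or random.Random()
    section = SECTIONS.get(key)
    if section is None:
        return None
    # Each channel is drawn uniformly from [0, 1), as Unity colours are.
    return section, tuple(rng.random() for _ in range(3))
```

In the Unity version the same lookup would run on a key-down event and assign the colour to the section's material.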
Accuracy and errors: Once running with the correct values, the program exhibits few significant errors. The two main sources of error are calibration and dimensional accuracy.
Calibration can be split into rotational and positional calibration. Positional calibration is done initially by placing the prototype at a predetermined location relative to the screen. Rotational calibration is done by eye, matching the angle of movement in the Unity scene to the real world.
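The positional step amounts to recording a one-off offset between the known placement position and the raw tracker reading, then applying that offset to every subsequent reading. A minimal Python sketch of that idea (the real implementation lives in the Unity project):

```python
def positional_offset(known_position, tracker_reading):
    """One-off offset mapping raw tracker coordinates into the screen's
    frame, found by placing the prototype at a predetermined spot."""
    return tuple(k - t for k, t in zip(known_position, tracker_reading))

def apply_offset(raw_reading, offset):
    """Calibrated position for any subsequent raw tracker reading."""
    return tuple(r + o for r, o in zip(raw_reading, offset))
```

Rotational calibration has no equivalent closed form here, since it is matched by eye.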
Dimensional accuracy can be split into screen size and model size. The screen size is entered by hand into the Projection Plane GameObject according to the monitor’s dimensions. The AR mirror effect is strongest when the sizes of the digital and physical models correspond reasonably accurately, so the scale of the digital model must be tweaked (although a model taken directly from CAD should already be accurate).
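Both quantities reduce to simple arithmetic. A small Python sketch, assuming the monitor size is specified by diagonal and aspect ratio (the function names are illustrative, not from the project files):

```python
import math

def screen_size_m(diagonal_inches, aspect=(16, 9)):
    """Width and height (in metres) to enter into the Projection Plane,
    derived from the monitor's diagonal and aspect ratio."""
    w, h = aspect
    unit = diagonal_inches * 0.0254 / math.hypot(w, h)
    return w * unit, h * unit

def model_scale(physical_length_mm, digital_length_mm):
    """Uniform scale for the digital model so a key dimension matches the
    physical prototype (about 1.0 when both share the same CAD source)."""
    return physical_length_mm / digital_length_mm
```

For a 24-inch 16:9 monitor this gives a Projection Plane of roughly 0.53 m by 0.30 m.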
If the line of sight between the tracker and the lighthouses is blocked, responsiveness decreases, the model drifts, and movement then stops. Otherwise, the program is reliable, with excellent responsiveness and no tracking issues.
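One simple guard against this failure mode is to freeze the on-screen model as soon as tracker updates go stale, rather than letting it drift. A sketch, with an assumed 200 ms timeout (not a value from the project):

```python
def pose_is_fresh(last_update_s, now_s, timeout_s=0.2):
    """True while tracker updates are arriving on time; when line of
    sight to the lighthouses is lost, updates go stale and the model
    should be held at its last pose instead of drifting."""
    return (now_s - last_update_s) < timeout_s
```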
Future Outlook: Physical / digital prototyping is a priority work area for the group, and we’re keen to develop this and future technologies over the next few years. Our current plans include extending the physical and digital reconfiguration and manipulation options for the prototypes, different technologies for synchronising and representing the digital version, the introduction of new immersive technologies such as haptics and gesture control, and integrating lightweight real-time analytics to give further understanding of the prototypes’ performance.