Virtual fitting rooms with 3D body mapping add an interactive fun factor to traditional retail. Built on gesture-driven sensor platforms such as Microsoft Kinect* for Windows, virtual fitting rooms give consumers a fast way to try on lots of different outfits and accessories without removing what they’re wearing. Still in the early stages of development and sophistication, these systems hold great promise and opportunity for the ambitious developer looking to make a mark. In this blog, I’m going to take a look at what makes virtual fitting rooms a must-have for retailers and how to achieve the performance levels necessary to meet customer expectations.
For traditional retailers, virtual fitting rooms provide a new way to attract customers and encourage them to spend more time in the store. Equally important, especially for the bottom line, virtual fitting rooms offer a solution to the high rate of returns on clothing due to poor fit or a buyer’s second thoughts. According to one source, up to 40 percent of clothing purchased is returned because of a poor fit.
For consumers, virtual fitting rooms provide a quick way to try on lots of different items and make better selections. For today’s time-pressed consumer, this can be a real advantage. Virtual fitting rooms are also just plain fun.
If you search online for videos showing examples of virtual fitting rooms (see this example), the first thing you notice is that some of the technology seems a bit crude. The garments seem to float on top of the body instead of truly wrapping the body and moving with it.
Fortunately, the technology is improving quickly. Bodymetrics, a London-based pioneer in 3D body-mapping, demonstrated its Bodymetrics Pod at CES 2012. A demo video of its imaging capabilities shows the ability to contour the garment around the person’s body and use different colors to show where the fabric would be tight and where it would be loose on a particular person’s body. The technology made its American debut in March 2012 at Women’s Denim Days at Bloomingdale’s in Century City, Los Angeles. Once mapped, customers can not only virtually try on jeans in the store, but they can also later access an online account to model and order additional jeans based on their body shape. Enabling customers to virtually try on jeans can be a real advantage in a store like Bloomingdale’s that stocks hundreds of different jeans styles.
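The tight/loose color coding can be understood as a per-region comparison between the wearer’s body measurements and the garment’s. Here’s a minimal sketch of that idea in Python; the region names, ease threshold, and color scheme are my own illustrative assumptions, not Bodymetrics’ actual algorithm:

```python
# Classify garment fit per body region by comparing the garment's
# measurement to the wearer's. Threshold and regions are illustrative.

EASE_CM = 2.0  # garment must exceed the body by this much to read as loose

def classify_fit(body_cm, garment_cm, ease_cm=EASE_CM):
    """Return a color code for one region: red=tight, green=fitted, blue=loose."""
    diff = garment_cm - body_cm
    if diff < 0:
        return "red"    # garment smaller than body: tight
    if diff <= ease_cm:
        return "green"  # close fit
    return "blue"       # noticeably loose

def fit_map(body_measurements, garment_measurements):
    """Color-code every region shared by both measurement sets."""
    return {
        region: classify_fit(body_measurements[region], garment_measurements[region])
        for region in body_measurements
        if region in garment_measurements
    }

body = {"waist": 84.0, "hip": 100.0, "thigh": 58.0}
jeans = {"waist": 82.0, "hip": 101.5, "thigh": 63.0}
print(fit_map(body, jeans))  # {'waist': 'red', 'hip': 'green', 'thigh': 'blue'}
```

A production system would of course run this comparison over hundreds of measurement points and blend the results into a smooth color map on the rendered garment.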
Figure 1. Bodymetrics Pod.
To provide such precise body imaging, the Bodymetrics Pod uses eight Kinect for Windows sensors arranged in a circle. Housed in its own enclosure (see Figure 1), the system keeps the scanning process secure and private. Software calculates the shape of the person, producing a 3D map complete with hundreds of measurements and contours.
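To get a feel for where those hundreds of measurements come from, consider one of the simplest: estimating a circumference from a horizontal “slice” of the merged 3D point cloud. The sketch below is a toy illustration of the general idea, not Bodymetrics’ actual pipeline; a real scan would supply noisy, irregular points that first need ordering and smoothing.

```python
import math

def ring_circumference(points):
    """Sum segment lengths around an ordered ring of (x, y, z) points."""
    total = 0.0
    for i in range(len(points)):
        # Distance from each point to the next, wrapping back to the start.
        total += math.dist(points[i], points[(i + 1) % len(points)])
    return total

# Synthetic test data: a perfect circle of radius 0.15 m at waist height.
radius, height, n = 0.15, 1.0, 360
ring = [(radius * math.cos(2 * math.pi * k / n), height,
         radius * math.sin(2 * math.pi * k / n)) for k in range(n)]
print(round(ring_circumference(ring), 3))  # ≈ 0.942 (2 * pi * 0.15)
```

Repeat this over many slice heights and you begin to see why accurate body mapping involves so many calculations per frame, and why the underlying compute platform matters.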
To ensure customer engagement, it’s crucial to make all these calculations fast. From their personal computers to their smart phones, consumers are accustomed to sophisticated technology that provides nearly instantaneous results. To deliver that kind of performance, virtual fitting rooms require embedded computing solutions capable of handling hundreds of calculations and rendering images in real time.
The good news is that it’s easier than ever to meet these intense processing and graphics needs cost-effectively. The latest embedded boards using 3rd generation Intel® Core™ processors with Intel® HD Graphics 4000/2500 deliver a substantial compute performance gain over the previous generation, plus key graphics processing advantages for virtual fitting room applications.
Manufactured on industry-leading 22nm process technology with 3D Tri-Gate transistors, 3rd generation Intel® Core™ processors yield up to 20 percent better performance in the same thermal envelope as the previous generation. This means these new processors can crunch the hundreds of calculations necessary for accurate body mapping fast enough to keep customers engaged and excited about the virtual fitting room experience.
Intel’s integrated graphics is an important enabler of this experience. The 3rd generation Intel® Core™ processors provide excellent graphics performance—up to 2x the 3D performance of the previous generation—all without the need for an expensive graphics card. In fact, developers could use this processor’s ability to drive three independent displays to create a fitting room experience similar to having three angled mirrors. That would really up the ante when competing with other virtual fitting room designs, letting shoppers view their body from several angles simultaneously while the system captures their movements. This could provide a significant competitive advantage for stores looking to impress customers with the latest innovations.
Figure 2. Microsoft Kinect for Windows sensor unit.
Coupling the Kinect for Windows sensor (see Figure 2) with such a sophisticated processor takes the concept of a virtual fitting room to a new level. Microsoft has made developing such Kinect for Windows applications particularly easy by choosing a hardware-only business model: Microsoft will not be charging for the SDK or the runtime, making them available free to developers and end users, respectively. This means developers can innovate with confidence, knowing that the Kinect for Windows hardware will be supported by Microsoft and that they won’t have to pay license fees for the Kinect for Windows software or its ongoing updates.
Want to create a better mousetrap for this segment of the retail market? I’d like to hear your thoughts on the combination of Kinect for Windows and a platform based on 3rd generation Intel® Core™ processors as the place to start. To learn how Intel® Intelligent Systems Alliance members can provide these platforms, see intel.com/go/embeddedalliance.
Microsoft is an Associate member of the Intel® Intelligent Systems Alliance.
Roving Reporter (Intel Contractor), Intel® Intelligent Systems Alliance
Associate Editor, Embedded Innovator magazine