If you’ve read my previous 3D Scanner Review of the 3D Systems Cubify Sense you know that I was impressed by the ease-of-use and geometric details for a device priced below €450. But the quality of the color information, or textures, that the Sense captures is completely underwhelming.
In this Review I’m testing the Sense’s mobile brother, the iSense*. At least that’s how 3D Systems rebranded the device. Its original name is Structure Sensor, made by Occipital. I’m testing that original version, which I got from the Dutch 3D Printing and 3D Scanning Store MakerPoint.
Cost-wise the Structure Sensor is a bit more expensive than the Sense: the Structure Sensor itself costs €440, but you’ll need a €60 bracket to attach it to a compatible iPad. And then you’ll need a compatible iPad, of course! It’s compatible with all iPads newer than the 4th Gen iPad and iPad mini 2 (previously known as “iPad Mini with Retina Display”) — including recently added support for the iPad Pro 9.7″ & 12.9″.
I’ve tested it with the least powerful compatible device, the iPad mini 2, which has a 5 megapixel camera with an aperture of f/2.4. Since this camera is used to capture color details, it’s safe to say that using a newer iPad will result in better texture quality. The iPad Pro 9.7″, for example, has a 12 megapixel camera with a faster f/2.2 lens. That being said, I think that testing with an iPad mini 2 is a great benchmark, and this iPad is still being sold for €265, bringing the total minimum cost of the Sensor + Bracket + iPad to €765 — which is still a lot less than many other 3D Scanners. (All prices I mention are in Euros and include 21% Dutch VAT.)
I’ve tested the Structure Sensor with 3 different applications, which I’ll cover in 3 different parts of this Review:
- Occipital’s own iPad apps
- The third-party itSeez3D iPad app and
- Occipital’s Skanect software for Mac and Windows.
Setting up the Hardware
Attaching the Bracket
Firstly, you need to attach the Structure Sensor to the bracket for your device (both are available in silver or blue aluminum). The sensor comes with 4 screws and a screwdriver. You can use an official bracket, but you can also order 3D Printed versions through Shapeways or download one from Thingiverse and 3D Print it yourself. The Structure Sensor is a very “open” and customizable system this way. Occipital has even placed CAD drawings on its developer website so you can design your own bracket. Developers who want to experiment with the sensor beyond using it with an iPad can also buy a €50 USB Hacker Cable to attach the sensor directly to any chosen device.
3D Printing a Lens Cap
The brackets are designed to be easily removed from the iPad. This is nice, because like me, you’ll probably have a case or sleeve of some kind to protect your iPad during transport. Unfortunately, the Structure Sensor doesn’t come with any kind of carrying pouch. And while the bracket with the sensor attached is sturdy enough to transport in one of the side pockets of my laptop bag, I don’t want any scratches on the glass. A lens cap isn’t included, so I downloaded this Structure Sensor Lens Cap from Thingiverse and 3D printed one myself. If you don’t have a 3D Printer yourself, you’ll probably live near a 3D Hub that can help you out.
Calibrating the Sensor
Before you use the Structure Sensor for the first time, it has to be calibrated. This is mainly because the iPad’s native rear-facing RGB camera is used to capture the color information, and the location of that camera varies between iPad models. Because of this, the offset between the iPad camera and the infrared camera on the Structure Sensor has to be compensated for.
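To see why this offset matters, here’s a minimal sketch of a pinhole projection — not Occipital’s actual calibration code, and both the camera intrinsics and the few-centimetre offset are made-up values for illustration. The same 3D point lands on noticeably different pixels in the infrared and RGB images, which is exactly the mismatch the calibration compensates for:

```python
# Hypothetical pinhole projection illustrating the RGB/IR camera offset.
# All numbers (focal lengths, principal point, offset) are invented.
def project(point, fx=600.0, fy=600.0, cx=320.0, cy=240.0):
    """Project a 3D point (metres, camera frame) to pixel coordinates."""
    x, y, z = point
    return (fx * x / z + cx, fy * y / z + cy)

# A point one metre in front of the infrared camera.
point_ir = (0.10, 0.05, 1.0)

# Assumed rigid offset of the RGB camera relative to the IR camera
# (a few centimetres; the real value depends on the iPad model and bracket).
offset = (0.03, -0.01, 0.0)
point_rgb = tuple(p - o for p, o in zip(point_ir, offset))

print(project(point_ir))   # pixel in the IR image
print(project(point_rgb))  # pixel in the RGB image: clearly shifted
```

Without knowing the offset, the color sampled for a depth point would come from the wrong pixel, which is how you end up with misaligned textures.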
There’s a dedicated Calibrator app that makes the process very easy: just go outside on a bright day and point the sensor towards something with a lot of detail. The app will track some details automatically to do most of the calibration and lets you fine-tune it afterwards if necessary.
3D Scanning with the Occipital Scanner App
The native Scanner app is pretty straightforward. Simply point it at the person or object you want to capture and use a two-finger gesture to scale the indicator box so it matches the size of the subject, which will also be highlighted.
There are only two options:
- Scan using the “Old Tracker” that only uses shape information or the “New Tracker” that also uses color information to keep track of the object you’re scanning. I see no reason to use the old tracker, because scanning without color information makes it hard or impossible to scan uniform objects without many geometric details.
- “Low Resolution” or “High Resolution” Color. From my tests, the difference between the two is that the former outputs a 2K (2048 x 2048 pixels — or 4.2 megapixels) texture map and the latter a 4K (4096 x 4096 pixels — 16.8 megapixels) version.
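A quick sanity check of those megapixel figures — this is just the pixel arithmetic, nothing scanner-specific:

```python
# Texture map sizes for the two color settings.
low = 2048 * 2048    # "Low Resolution": 4,194,304 pixels ≈ 4.2 MP
high = 4096 * 4096   # "High Resolution": 16,777,216 pixels ≈ 16.8 MP

print(low / 1e6)     # 4.194304
print(high / 1e6)    # 16.777216
print(high // low)   # 4: doubling each side quadruples the pixel count
```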
The high resolution version does take a bit longer to render, or calculate. This is done on the iPad itself and my iPad mini 2 is one of the slowest compatible iPads. The exact time depends on the size and complexity of the subject. During this time you can’t make a new scan.
Scanning with a turntable
You can make 3D scans either by freely moving around an object or by using a turntable. Below is a video of the scanning process with the help of a €6 IKEA SNUDDA Turntable and a €4 tablet stand from Xenos. It’s played back at 4 times the original speed (hence the high-pitched 3D printer sound in the background).
As you can see I put Teddy on top of a book (the marvelous The Art of The Last of Us, to be precise). This makes it easier for the RGB tracker to correctly track the object while rotating. The app particularly had a hard time tracking the side view of the bear without the book. Apart from the natural light from the windows in the studio, I only used a 55 Watt light with a small softbox (visible on the right in the video).
When the rendering is done, you can preview the scan and export it through email as a .OBJ 3D model with a separate texture map in .JPG format. Unfortunately there’s no way to save your scans on the iPad, so you need to have an internet connection to email the scan before making a new one. The .OBJ gets emailed as a .ZIP containing the model and texture files. The .ZIP can be uploaded directly to Sketchfab:
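To give an idea of what arrives in your inbox, here’s a small self-contained sketch that builds a stand-in for the emailed ZIP (one tiny OBJ triangle plus a placeholder texture entry) and reads it back. The filenames are made up; the actual names depend on the app:

```python
import io
import zipfile

# Build a stand-in for the emailed archive. OBJ is a plain-text format:
# lines starting with 'v ' are vertices, lines starting with 'f ' are faces.
obj_text = "v 0 0 0\nv 1 0 0\nv 0 1 0\nf 1 2 3\n"
buffer = io.BytesIO()
with zipfile.ZipFile(buffer, "w") as archive:
    archive.writestr("Model.obj", obj_text)
    archive.writestr("Model.jpg", b"\xff\xd8\xff")  # placeholder JPEG bytes

# Read it back and count vertices and faces in the OBJ.
with zipfile.ZipFile(buffer) as archive:
    obj_name = next(n for n in archive.namelist() if n.endswith(".obj"))
    lines = archive.read(obj_name).decode().splitlines()

vertices = sum(1 for line in lines if line.startswith("v "))
faces = sum(1 for line in lines if line.startswith("f "))
print(vertices, faces)  # 3 1
```

The same vertex/face counting works on a real exported scan, which is a handy way to compare mesh resolutions between apps.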
I was surprised by the scan quality, especially taking into account that it took no time to “render” the geometry and only a minute to render the textures on a relatively slow iPad mini. The color quality sits somewhere between the blurry textures of the 3D Systems Cubify Sense (click for reference scan of the same model) and the sharper result of Photogrammetry with the free Autodesk 123D Catch app. I would say a scan like this is usable for many non-industrial purposes, and you’d probably get better textures from a higher-end iPad. By scanning like this from a fixed angle I wasn’t able to capture under Teddy’s arms.
For the next example I chose a bigger and less fluffy object: a small oil drum which has been modified into a basket (in Bali through Fair Trade store Jansje). I placed it on yet another of my daughter’s cheerful—and perfectly trackable—placemats on our dining table and walked around it freely, stopping every once in a while when asked to by the app. At those stops it takes a photo for the textures. I was surprised how smooth this experience is and how well the software tracked the object—it never lost it! And this is with the last bit of afternoon daylight and two dimmed ceiling lamps, so far from perfect lighting conditions.
Below is the result of the scanning session above. As you can see it didn’t completely capture the handle on the lid, but the handles on the sides are fine. The texture is okay, but not as sharp as I’d hoped. The stitching of the textures is quite good, though.
Since I asked my business partner Patrick to model for the Cubify Sense 3D Scanner Review, I asked him to wear the very same shirt to the office for a perfect comparison. As you can see he was very happy.
As you can see below, the result is a lot better than that of the Sense. The resolution is a bit low—both in terms of geometry and texture—but it required no rendering after scanning for the mesh and only a minute for the texture information. To me it kind of looks like a video game character. It also makes Patrick look 10 years younger…
Update June 27th 2016
Occipital just released a new version of their Scanner app which uses their new 0.6 SDK. The update promises higher quality meshes — a 60% resolution improvement on each of the 3 axes, actually — partially due to their acquisition of Lynx Laboratories.
Naturally, I wanted to test if this is true!
At this size, the polycount was 50,000 vs. 44,000 with the old app based on the 0.5.5 SDK. That’s about a 14% improvement, but not really visible. So I scanned my loyal test subject and business partner Patrick:
At this size the upgrade in resolution is very apparent: 24,169 vs. 56,427 faces with the new app — roughly 2.3 times as many polygons! Even without the numbers, there’s clearly more detail in Patrick’s face (he likes the old mapper better) and T-shirt.
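For the curious, the face-count arithmetic works out like this:

```python
# Face counts from the two scans of Patrick.
old_faces = 24169  # app built on the 0.5.5 SDK
new_faces = 56427  # app built on the 0.6 SDK

ratio = new_faces / old_faces
print(round(ratio, 2))           # 2.33: roughly 2.3x as many faces
print(round((ratio - 1) * 100))  # 133: about 133% more faces than before
```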
It’s impressive that this kind of improvement can be achieved with just a software update. And remember it’s all rendered locally on an iPad—in realtime—no cloud processing. And according to the Developer Program Manager of Occipital, this is just the beginning:
— Mark Piszczor (@mpiszczo) June 27, 2016
It’s worth mentioning that since this update to SDK 0.6 there’s no longer support for the 3D Systems iSense I mentioned in the intro. So the new Scanner app only works with an actual Structure Sensor bought from Occipital or one of its resellers.
The Structured Light technology that the Structure Sensor uses to capture depth is great for 3D scanning indoors. Trying to capture objects outside the comfort of my studio I quickly ran into the limitations. As you can see in the calibration image in the beginning of this post, the infrared camera has a hard time capturing details in bright sunlight, let alone see the projected laser pattern.
So I did some experiments on a cloudy day—of which there are enough in The Netherlands. Below is a typically Dutch litter bin.
Again I was surprised by the tracking: even without a flat floor surface the litter bin was tracked smoothly. It was a bit hard to capture the back without falling into the canal, but I managed quite well: it apparently doesn’t matter if the object goes out of frame for a moment, because its surroundings are tracked as well.
Below is the result of the scan. It’s far from perfect, but I find it especially interesting that there’s a hole where the icon is printed. Apparently the dark green of the bin absorbs enough sunlight for the infrared sensor to detect the laser pattern, but the white icon is simply too bright. There are also a lot of particles flying around which I’m pretty sure weren’t present in reality.
Below is another outdoors scan of a Dutch mail box when there was a bit more sunlight. The shadow side of the object and the wall are captured decently, but the side that’s facing the sun was totally invisible to the Structure Sensor. The result is rather interesting, but not usable of course.
3D Scanning interiors with the Room Scanner App
The last app I’ll cover in this part of the review is Occipital’s Room Scanner App. It’s completely foolproof: use the slider to visually indicate the size of the room you’re about to scan, hit scan and move around until every surface is covered by green polygons. Unfortunately, the app forces you to stay more or less in one place, so you can’t walk around a room freely to scan around corners.
Its usefulness greatly depends on the purpose. As you can see below, the output quality is too low — both in terms of geometry and texture — to use for any purpose that needs to be aesthetically pleasing. (You can look around in the interactive Sketchfab embed below by changing the navigation from Orbit to First Person with the icons in the bottom-right corner.)
However, I can think of a few purposes for which the Room Scanner app is useful. For instance, game level designers can use it as a dimensionally accurate reference to create an interior for a video game. It’s also very handy for taking quick measurements, because making a scan only takes a minute and the app has a built-in measuring feature.
Wrapping up Part 1
It’s good to realize that Occipital’s Scanner apps are samples to demonstrate new features that are announced every few months. For example, in March 2015 SDK 0.4 was introduced, which offered the New Tracker I wrote about earlier, as well as the ability to output UV-textured meshes. Earlier versions apparently used per-vertex coloring that delivered the same low-quality color information that disappointed me in my 3D Systems Sense Review.
So… the Structure Sensor has an SDK that allows third-party developers to create their own apps for all kinds of purposes. For example, it can be used to create Virtual Reality & Augmented Reality experiences. However, for this Review I’m merely using it as an iPad-based 3D Scanner, and for that purpose there’s only one third-party app — itSeez3D — which uses cloud processing for more detailed scans.