3D Scanning at Home


You don't need fancy equipment. You can conduct 3D scanning at home, with the equipment you already have.


I used to think that 3D scanning was something that had to be done in a lab or using expensive equipment, but thanks to Steve from CG Geek, I learned that it can be done with some awesome free software and equipment that I already have.

In his tutorial, Steve demonstrates how you can capture a large object, and I highly recommend watching it because he explains the process very well. What I'll be discussing in this post is capturing a small object using a slightly different technique, which should be easier at this size. If you prefer to follow along with a video, I've got you covered:

The technique I am going to use is called photogrammetry: you take a bunch of reference pictures, run them through your software, and get a 3D model out the other end.

1. Pick Your Subject

First, choose your subject. Miniatures are almost the worst-case scenario for this kind of work because they are very small, and the detail is tiny. We're talking fractions of a millimeter here. Add to that the fact that I'm going to be running a painted miniature through this process, and it gets even worse.

The first miniature I tried to scan was this one I sculpted about 20 years ago.

Unfortunately, it has very little detail for the software to latch on to, and it includes parts covered in metallic paint and a layer of varnish, which totally confused the software. The results were, let's say, trippy.

Instead, I'm now going to try with this Games Workshop zombie, purely as an experiment. (People, do not copy copyrighted stuff. It's not cool.) This model has more clearly defined detail and color variations, which should help.

2. Lighting

Step two: lighting. To begin with, I taped down my homemade lightbox to a solid surface so it wouldn't move as I worked. The lightbox ensures that the lighting is consistent and that the object is properly illuminated. I am not using a background in the lightbox here since the depth of field seems to help the process. Also... well, the results that contained a background were kind of funky.

If you do not have a lightbox and don't fancy making or buying one, don't worry — just use indirect sunlight — an overcast day is perfect for this. What you are looking for here is even lighting with no weird shadows.

3. Set Up the Camera and Subject

Third step: photograph the mini at consistent angle intervals. From experiments I did yesterday, I determined that my hands were not good at this level of precision. Instead, I made a small jig that I could use to turn the mini 10 degrees each time. This is literally just two paper circles with angle markings, held together with a pin.

This allowed me to rotate the miniature around a fixed center, and by lining up the markings on the smaller and the larger circle, I could guarantee that the rotation was a consistent 10 degrees. The numbers on the segments were added in case I had to re-take any angles. Since the pin stuck some way out of the surface, I also added a bottle cap so the mini would sit flat, and taped the mini on top.

With the subject in place, I set up the camera on a tripod and made sure that all the settings were set to manual. This ensures that all the settings remain the same throughout the shoot. I used my video camera to take stills here, though you should be able to get better results from a dedicated photo camera or even some mobile cameras, since their sensors tend to be larger.

4. Start Snapping Photos

Step four: I started snapping photos. After every photo, I rotated the subject 10 degrees. Then, I repeated this a couple of times from different heights.
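The arithmetic behind the jig and the shooting plan is simple, and can be sketched in a few lines of Python. The 10-degree step comes from the article; the number of height passes is an assumption for illustration (the text only says "a couple of times"):

```python
# Angle markings on the jig: one full revolution in 10-degree steps.
STEP_DEGREES = 10
angles = list(range(0, 360, STEP_DEGREES))

print(len(angles))    # 36 positions per pass around the mini
print(angles[:4])     # [0, 10, 20, 30] - the first few markings

# Shooting every position from several heights multiplies the photo count.
# Three passes is an assumed number, not from the article.
height_passes = 3
total_photos = len(angles) * height_passes
print(total_photos)   # 108 photos to feed into the photogrammetry software
```

More photos with good overlap generally give the reconstruction more to work with, which is why the consistent small step matters more than the exact count.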

5. Run the Photos Through Meshroom

Step five: I took my SD card and imported all the photos into Meshroom. This software is free to download and use. It is a magnificent tool; honestly, to the team who made this: you blew my mind. It's literally a case of dropping the photos into the application and hitting start.
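For anyone who prefers to script this, the Meshroom binaries also ship a headless batch wrapper. The exact executable name varies by release (it was `meshroom_photogrammetry` in the 2019-era binaries and `meshroom_batch` in later ones), so treat this as a sketch rather than a guaranteed invocation, and the paths are examples:

```shell
# Sketch: run the full Meshroom pipeline headless over a folder of stills.
# Executable name depends on your release: meshroom_photogrammetry (2019-era)
# or meshroom_batch (later). Input/output paths below are examples.
./meshroom_photogrammetry --input ~/scans/zombie_photos --output ~/scans/zombie_model
```

The output folder ends up with the textured mesh (an OBJ plus texture files) that you can then pull into Blender.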

In my first run-through, most of the model looked great, but it was missing small parts like the jaw and fingers on the left hand. This corresponds to areas where the software couldn't extract enough information from the images. I was still very impressed by the quality of what did get modeled, so I bumped the describer preset up to high and made another run with the same set of images to see if I could get a better model.

Unfortunately, that run failed due to a known bug in version 2019.01. No big deal: I went back, downloaded version 2018.01, set it running again, and it worked fine with the high preset.

I don't think that's a big deal, and to be honest, if I had used a better camera in the first place, I probably wouldn't have needed the high preset at all.

I'm also told that the next version, with the fix for this, will be out in the coming months. If you're going to try this, I'd still suggest getting 2019; chances are that if you're using a decent camera, you won't run into this problem anyway.

The model was completely scanned this time, with only a couple of artifacts around the armpits and near the crotch. That's great for a scan of a miniature, and it's stuff I can clean up in Blender anyway.

When importing a model scanned with Meshroom into Blender, it's a good idea to switch to the Cycles renderer, as this allows it to import and attach all the texture nodes.
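That renderer switch and the import can also be done from Blender's own Python console. This is a minimal sketch for a 2.7x/2.8x-era Blender (matching the article's timeframe), where `import_scene.obj` was the OBJ import operator; the file path is an example, and the script must run inside Blender, not standalone Python:

```python
# Run inside Blender's Python console or Text Editor (requires bpy).
import bpy

# Switch to Cycles first so the imported material's texture nodes attach.
bpy.context.scene.render.engine = 'CYCLES'

# Example path to Meshroom's textured output; adjust to your own project.
bpy.ops.import_scene.obj(filepath="/home/me/scans/zombie_model/texturedMesh.obj")
```

After the import, the mesh should come in with its material and texture nodes already wired up, ready for cleanup.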

Considering this is a photo scan of a 28mm mini with an OK camera, what the software did was pretty amazing.

And that's it for this experiment. Of course, I barely scratched the surface here and there's a lot I need to learn about preparing images and tweaking settings. However, I am now confident that this is doable.

I hope you found this useful, and if you have any comments or suggestions, I'd love to hear them so do leave a comment below.

3d, 3d scanning, iot, scanning, tutorial

Published at DZone with permission of Karl Agius, DZone MVB. See the original article here.

Opinions expressed by DZone contributors are their own.
