Surface reconstruction by warping the mean mesh

Hi Alan, Shireen, and the ShapeWorks community,

I am using the data augmentation methods built into ShapeWorks to fit a statistical distribution to my shape model and then sample some random shapes from it. Now I would like to create surface meshes for those randomly sampled shapes using the same technique that is built into ShapeWorksStudio (warp the mean surface mesh).

My question is this: Is there a convenient function wrapped for Python that I can call to do this? Failing that, where would I look in the ShapeWorks codebase to find the surface reconstruction code?

Thanks in advance!

The relevant shapeworks command is “warp-mesh”:

shapeworks warp-mesh --help
Usage: warp-mesh [args]...

warps a mesh given reference and target particles

  -h, --help            show this help message and exit
  --reference_mesh=STRING
                        Name of reference mesh.
  --reference_points=STRING
                        Name of reference points.
  --landmarks=STRING    Optional argument to specify the name of landmark file,
                        if landmarks are available for warping
  --target_points <list of strings>
                        Names of target points (must be followed by `--`), ex:
                        "... --target_points *.particles -- ..."
  --save_dir=STRING     Optional: Path to the directory where the mesh files
                        will be saved

You will want to supply your reference mesh (e.g. the mean mesh), reference particles (the mean particles), and then the target particles.
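Put together, an invocation might look like the following, assembled from Python so it can be scripted (all file names here are hypothetical, and the flag names are taken from the help text above — double-check them against your ShapeWorks build):

```python
import subprocess

# Hypothetical paths; substitute your own mean mesh/particles and samples.
cmd = [
    "shapeworks", "warp-mesh",
    "--reference_mesh", "mean.vtk",          # reference mesh (e.g. the mean mesh)
    "--reference_points", "mean.particles",  # matching reference particles
    "--target_points", "sample_01.particles", "sample_02.particles", "--",
    "--save_dir", "warped_meshes",
]

print(" ".join(cmd))
# Requires a ShapeWorks install on PATH; uncomment to actually run:
# subprocess.run(cmd, check=True)
```

Note the trailing `--` after the target particle files, which the help text requires to terminate the list.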

An example for the Python API is given here:

Thanks Alan. This is exactly what I was looking for!

Hi Alan, all,

I am now able to successfully run the augmentation and warp meshes:

# Run the augmentation
embedded_dim = DataAugmentationUtils.runDataAugmentation(path_aug, img_list,
                                                         local_particle_list, num_samples,
                                                         num_dim, percent_variability,
                                                         sampler_type, 0, 10,
                                                         world_particle_list)

When warping the mesh, I carry along a set of landmarks that I annotated in SW Studio on my reference mesh (in this case the median mesh and its particle distribution):

# Warp the meshes
reference_particles   = shapeworks.ParticleSystem([path_to_median_particles]).ShapeAsPointSet(0)
reference_mesh        = shapeworks.Mesh(path_to_median_mesh)
landmark_particles    = shapeworks.ParticleSystem([path_to_landmarks]).ShapeAsPointSet(0)

warper = shapeworks.MeshWarper()
warper.generateWarp(reference_mesh, reference_particles, landmark_particles)
warped_mesh = warper.buildMesh(target_particles)

However, for my application I only want to work in the “world” coordinate frame (and actually don’t need the augmented image volumes). I want my particle positions and reconstructed meshes to all live in the common, co-registered world space, with scale removed.

So my question is this:

The runDataAugmentation function does seem to properly sample from the embedding in the “world” coordinate frame, but the particles output of the augmentation code is in a “local” coordinate frame (corresponding to the new generated image?). I cannot work out from the code how this world-to-local transformation is computed, or where it is stored. Can you explain how this works?

thanks in advance!


Hi Josh,

Sorry for the confusion regarding the runDataAugmentation() function, we have an open issue to clarify this here: Clarify runDataAugmentation() params · Issue #2072 · SCIInstitute/ShapeWorks · GitHub

There are two ways to call runDataAugmentation().
The first is:

DataAugmentationUtils.runDataAugmentation(out_dir, img_list, 
                                          world_point_list, num_samples, 
                                          num_dim, percent_variability, 
                                          sampler_type, mixture_num)

This generates image/particle pairs in the world coordinate system and assumes the images in img_list are groomed/aligned so they are in the world coordinate system.

The second is:

DataAugmentationUtils.runDataAugmentation(out_dir, img_list, 
                                          local_point_list, num_samples, 
                                          num_dim, percent_variability, 
                                          sampler_type, mixture_num,
                                          processes, world_point_list)
This generates image/particle pairs in the local coordinate system and assumes the images in img_list are the original/unaligned images. The world_point_list needs to be provided in this case so that PCA is done in the world coordinate system. New samples are generated by sampling the world PCA subspace and then mapping each sample to local points using the world-to-local transform of the closest real example. In the future, we could add noise to this transform as an additional form of augmentation, but right now this is not included.
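That sampling-then-mapping step can be sketched in plain NumPy (toy data throughout — the arrays, the simple translation standing in for the stored world-to-local transforms, and all names here are illustrative, not the ShapeWorks API):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins: 5 real examples, each a flattened set of world particles,
# plus a per-example world->local translation (real ShapeWorks transforms
# are stored per shape; a plain translation is used here for illustration).
world_points = rng.normal(size=(5, 12))   # 5 examples x (4 particles * 3 coords)
local_offsets = rng.normal(size=(5, 3))   # hypothetical world->local shifts

# PCA on the world points (done in the world coordinate system).
mean = world_points.mean(axis=0)
centered = world_points - mean
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
num_dim = 2
scores = centered @ Vt[:num_dim].T        # embedding of the real examples

# Sample a new point in the PCA subspace (Gaussian around the real scores).
sample = rng.normal(scores.mean(axis=0), scores.std(axis=0))
gen_world = mean + sample @ Vt[:num_dim]  # back to world particle space

# Map to local space using the transform of the *closest real example*
# in the embedding, as described above.
base_index = int(np.argmin(np.linalg.norm(scores - sample, axis=1)))
gen_local = gen_world.reshape(-1, 3) + local_offsets[base_index]

print(base_index, gen_local.shape)
```

The key point is the last step: the generated sample lives in world space until the transform of its nearest real neighbor in the embedding is borrowed to place it in a local frame.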

I believe for your case, you can just use the first way of calling the method where the third parameter is the world particles.
If you don’t require augmented images, instead of calling runDataAugmentation you could try the following code (will be much faster):

import numpy as np

from DataAugmentationUtils import Utils
from DataAugmentationUtils import Embedder
from DataAugmentationUtils import Sampler

point_matrix = Utils.create_data_matrix(world_point_list)
PointEmbedder = Embedder.PCA_Embbeder(point_matrix, num_dim, percent_variability)
embedded_matrix = PointEmbedder.getEmbeddedMatrix()
PointSampler = Sampler.Gaussian_Sampler() # or Sampler.Mixture_Sampler() or Sampler.KDE_Sampler()
PointSampler.fit(embedded_matrix)
for index in range(1, num_samples+1):
    sampled_embedding, base_index = PointSampler.sample()
    gen_points = PointEmbedder.project(sampled_embedding)
    out_path = out_dir + 'sample_' + Utils.pad_index(index) + ".particles"
    np.savetxt(out_path, gen_points)

Let me know if this works or if you have further questions!


Hello Jadie,

Thanks so much for the response. Your suggested approach of using the PointSampler object is working well for me. And thanks for solving the mystery of how the local coordinate system is chosen for random samples!

