Stereographic Lampshades

I saw an article on the Guardian website here on the 3D-printing of shapes which project interesting patterns of light. Ignoring the strangely forced Halloween reference, I thought this would be an interesting project to attempt for an arbitrary pattern, perhaps as a personalised lampshade. Buoyed by the continuing high of leftover sweets from Friday night, let’s have a look.

Much like the guys in the article linked above, we want to take a projected image and work backwards to see what kind of shape can be projected to form that image. For simplicity, let's assume the image is formed by placing a light source at the top of a sphere of radius a as depicted below. This sphere has some holes cut into it which project light onto a plane a distance h below. The image on the plane is then the stereographic projection of the image on the sphere. We will need to work backwards – given a projected image with coordinates (x,y), can we work out the pattern of shapes cut into the sphere?

Obviously we can only make black-and-white images using this technique, though it would be interesting in the future perhaps to consider a sphere made of transparent printed plastic (for example) which projects a colour image.

A coordinate (x,y) on the image plane can be represented in polar coordinates (r,\phi). The ray from the light source to this point then propagates at an angle \theta given by

\tan\theta = \frac{r}{a+h}

and with some elementary geometry (the inscribed angle theorem) we know that this ray intersects the sphere at an angle \alpha = 2\theta measured from the centre of the sphere. With a new coordinate system centred at the centre of the sphere, the image coordinates are mapped to the new coordinates

x_{new} = a\sin\alpha\cos\phi

y_{new} = a\sin\alpha\sin\phi

z_{new} = a\cos\alpha

This is all the simple maths required (though of course this is an interesting topic to examine more deeply); the rest is getting an image into the correct format.
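The mapping is short enough to sketch in code. The rest of the post uses Matlab, but here is a pure-Python version of the same point transformation (`invstereo` is a name I've made up for this sketch; `a` and `h` follow the diagram above):

```python
import math

def invstereo(x, y, a, h):
    """Map a point (x, y) on the image plane to the point on a sphere of
    radius a (light source at the top, image plane a distance h below)
    whose shadow lands there."""
    r = math.hypot(x, y)           # polar radius on the image plane
    phi = math.atan2(y, x)         # polar angle, shared with the sphere
    theta = math.atan2(r, a + h)   # ray angle at the light source
    alpha = 2 * theta              # angle measured from the sphere's centre
    return (a * math.sin(alpha) * math.cos(phi),
            a * math.sin(alpha) * math.sin(phi),
            a * math.cos(alpha))
```

As a sanity check, a point at the origin maps straight to the pole of the sphere, and every output lies on the sphere of radius a.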

As an example, let’s use the following pleasant flowery image from Google image search:



The first step is to binarise the image, and figure out which parts we want to keep and which we want to cut out to form the holes in our lampshade. Here it’ll look best if we cut the black parts out, so the lamp projects a nice flowery pattern. To avoid any funny edge effects I’ll crop the pattern to a circle, and so our ‘mask’ is drawn below, along with the relevant snippet of Matlab.

P = imread('Pattern.png');
P = 1-im2bw(P); % Keep black parts
Nmin = min(size(P));
% Crop into square, then circle
P = P(1:Nmin, 1:Nmin);
[xg, yg] = meshgrid(1:Nmin, 1:Nmin);
P((xg - Nmin/2).^2 + (yg - Nmin/2).^2 > 0.99*0.25*Nmin^2) = 0;
% Create a small border
P = padarray(P, [1 1], 0);


Now, this is a bitmap image and we will eventually need something 3D-printable, i.e. in vector format. I need to go through each of these holes and trace a contour around it. First, however, I need to isolate each hole separately. There is, as always, a useful inbuilt function (bwconncomp) just for this purpose, which identifies the distinct white regions of a monochrome image and which pixels they correspond to. In the snippet below I loop through these regions and create a new image containing a single hole. Before I make the contour I smooth the image slightly to avoid any 'staircasing' of the contour line.

CC = bwconncomp(P);

for n = 1:CC.NumObjects
    % Create a new image for each hole
    newP = zeros(size(P));
    newP(CC.PixelIdxList{n}) = 1;
    % Smooth the image slightly
    newP = filter2(fspecial('average',3), newP);
    % Get a contour around the hole
    C = contourc(newP, [1 1]);
    % (the simplification and meshing steps slot in here)
end

There is one more thing to consider if we want decent performance, and that is the resolution of the contour. Matlab will quite happily produce an extra contour segment for each pixel, which is much more than we need. We can apply the Douglas-Peucker line simplification algorithm to each hole to drastically reduce the number of segments, using the File Exchange submission here. As illustrated below, we can reduce the number of points in each contour drastically without significantly worsening the image.
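The File Exchange submission does this work in the post; for the curious, the core of Douglas-Peucker is only a few lines. A toy pure-Python version (my own sketch, not the File Exchange code):

```python
import math

def rdp(points, eps):
    """Simplify a polyline: keep the endpoints, then recursively keep any
    interior point further than eps from the chord between them."""
    if len(points) < 3:
        return list(points)
    (x1, y1), (x2, y2) = points[0], points[-1]
    dx, dy = x2 - x1, y2 - y1
    norm = math.hypot(dx, dy) or 1.0  # degenerate chord on closed contours
    # Find the interior point furthest from the chord
    dmax, imax = 0.0, 0
    for i in range(1, len(points) - 1):
        x, y = points[i]
        d = abs(dy * (x - x1) - dx * (y - y1)) / norm
        if d > dmax:
            dmax, imax = d, i
    if dmax <= eps:
        return [points[0], points[-1]]
    # Split at the furthest point and simplify each half
    left = rdp(points[:imax + 1], eps)
    right = rdp(points[imax:], eps)
    return left[:-1] + right
```

One caveat for our closed hole contours (first point equal to last): the chord degenerates, so in practice the curve is first split into two halves and each half is simplified separately.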


Inserting this into the loop above, we get the vectorised, simplified, segmented holes which are plotted below and coloured separately. In this case there is one large hole and a few smaller ones.


This is great so far but at the moment we only have the outlines of the holes, we now need to fill in the blanks. The boundary of our lampshade is a simple circle, and the holes inside are a series of complicated contours. Fortunately, this kind of problem has been solved many times before. In the field of finite element analysis a potentially complex domain needs to be discretised into a mesh for numerical computation. Here I borrow the extremely useful submission Mesh2D from the File Exchange again, which will happily crunch through the boundaries I’ve defined and generate a mesh. This function is remarkably easy to use, so there isn’t much to say about it – it just requires a list of vertices and edges.


Now it should be clear why I was worried about the number of elements in the contour lines – each line segment forms the side of a triangular mesh element. If the contours contained many times more segments, the mesh would become very dense and slow to generate.

Happily we're almost there now; it just remains to warp this mesh into the form required. As each mesh element is a triangle, we can move the three corners wherever we like and they will still lie on a plane together, so there is no need to re-mesh. For each vertex we apply the transformations listed above, and pack the faces and vertices into a Matlab patch object:
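The warp only touches the vertex list; the triangle index list is reused unchanged. A hedged pure-Python sketch of that per-vertex loop (`warp_mesh` and `verts2d` are my own names; the version in the post is a few lines of Matlab applied to the Mesh2D output):

```python
import math

def warp_mesh(verts2d, a, h):
    """Lift each planar (x, y) vertex onto the sphere via the stereographic
    relations from the top of the post. Faces need no change: each
    triangle's three warped corners still define a plane."""
    verts3d = []
    for x, y in verts2d:
        theta = math.atan2(math.hypot(x, y), a + h)
        alpha = 2 * theta
        phi = math.atan2(y, x)
        verts3d.append((a * math.sin(alpha) * math.cos(phi),
                        a * math.sin(alpha) * math.sin(phi),
                        a * math.cos(alpha)))
    return verts3d
```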


Nice! This looks pretty, but is it correct? And could it actually be printed? To go further, I take this mesh and import it into Blender. Here I can solidify the mesh to create some physical thickness, and add a light source in the right place to check the stereographic projection.


Poor-quality render aside, this looks good. The blurring is due to the finite size of the light source rather than any depth-of-field I’ve added. If we compare the image from above and the original contour lines, we see that the stereographic projection is correct, and the original image is reproduced.



This is great, and now that we know everything works we can explore a little. What if we fix the size of the image but change the size of the lampshade?


In the small-image-size limit the projection is just a linear projection, and the lampshade is reduced to a simple circle.
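To see why, note that when the image is small compared to the sphere (r \ll a+h) the angles are small and the mapping linearises:

\theta \approx \frac{r}{a+h}, \qquad \alpha = 2\theta \approx \frac{2r}{a+h}

so that

x_{new} \approx \frac{2ar}{a+h}\cos\phi, \qquad y_{new} \approx \frac{2ar}{a+h}\sin\phi, \qquad z_{new} \approx a

The pattern is then just a scaled copy of the image sitting on a nearly flat cap at the pole of the sphere.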

What about some different patterns? Turning again to Google image search we can do a geometric pattern:


Or perhaps a lamp to replace your wallpaper (this one isn’t strictly possible to make…):



Finally, as a Bristolian at heart, how could I not include a stencil from the world's favourite graffiti artist (edit: as pointed out on Hacker News, this isn't actually a Banksy! The shame. It is authentic Bristolian graffiti though, so I'll let it stand). This might also be the first time ever that a panda bear has been meshed with an FEA mesher…


If you’d like to play with these meshes yourself, the .blend file containing the models above can be found here. You should be able to export to .stl files if you’d like to 3D print anything – if you do, let me know! I don’t have access to a printer here.

Update: Below in the comments, Aleksi has printed the panda lampshade with great results! Here’s one of the pictures of it in action:






52 thoughts on "Stereographic Lampshades"

  1. I’ve not got a 3D printer, so I wonder how easy this would be to project from a cube instead of a sphere. Then the lamp shade could be made from five sheets of card instead. Might not be as pretty as the spherical lamp shades though!


    1. Not a bad idea! The simplest way would be to initially partition the image into 5 pieces which individually map to a square face of the cube, and adjust the warping procedure slightly for each one. I might give this a go in the future as I really want to get something physical to play with – thanks for the idea.


      1. Thanks for the link! That looks like a fun idea, pretty much the same as these spherical designs but using flat sections. The problem boils down to calculating the shape of a shadow projected through a hole by a point source, which is something called a ‘projective transformation’ (it requires a bit of matrix manipulation). One thing you could do is create the right pattern by experiment – perhaps drawing on a transparent piece of paper with a sharpie until you get the projection shape that you want. You could then cut out the shape as required. Let me know if you get it to work!


  2. Boom. Intriguing models! I'm not big on the math part, but I have a decent sized printer to test 'em out. Blender crashes at startup every time on my x64 (even the older versions), so I converted the .blend file to .obj with an online converter. Took the .obj to Meshlab and picked only the panda model to lessen the memory burden on my ol' faithful (and my good friend is a big fan of pandas) and exported it to .stl. Then took the model to Netfabb to clear out any errors on it, but for some reason the autorepair cannot repair the model correctly as it infills the part after exporting and entering slic3r. Maybe if you'd have the original file as an .obj I could test it out again?


  3. Yeah, the panda's printing at the moment. Could you upload the other ones with thicker walls as well? I'll paste some photos tomorrow.


  4. Wait, I was promised stereographs. Get back with the 2-color or polarized projection through reflective masks next pumpkin harvest, then…


  5. Hi! I'd be interested in making a prototype of this via milling and assembling instead of 3D printing, given the tools I have access to. That said, would you be interested to look into the difference in projection between a lampshade whose extrusion is parallel to the ground vs. normal to the surface vs. a parametrised extrusion? 🙂 thanks!


      1. Hi, yes that is what I meant, but those variations would only affect the subtle detail of the shadow's outline on the ground.


    1. Ah, I see. I did think about this effect, as using the Blender ‘solidify’ modifier extrudes normal to the surface. As you say, this isn’t in the direction of the light rays so will cast some ‘wrong’ shadows. It would be possible to do the mesh extrusion in Matlab too and avoid this problem, but it isn’t something I’ve gotten around to yet, only because I’d have to work out which mesh edges are external vs. internal. This should be possible as I know the location of the holes before warping, I’d just need to keep track of the external mesh edges after warping and generate the new faces and vertices manually.


  6. Here's some pics of the printed panda! I turned the model 180 degrees on the x-axis to get a better base for printing. Scaled the original model 1400% and used 0.1mm layer resolution. Took about 4 hours. Even though the peak of the print came out at a lower quality, otherwise the print went nicely and the configurations you made on the shade's surface are nicely visible.

    Even though the silver gray PLA I used is quite thick, when placing the model near a light source there was some light leaking. But even my two year old daughter said: "Look dad, there's a panda on the wall!"

    The flower1, flower2 and the geometric model had some problems on them. Tried fixin’ but with no luck. Also printing the other models without supports would be really hard. The amount of bridges and overhangs…

    To get the printing time down on the panda, I would suggest losing the surface configurations and making the outer shell completely round. The light leaks could be corrected by thickening the walls even more.

    You've got a very nice idea here! I've got a printer with a 600x600x600mm printing area and I'd love to try printing the panda or some of the other models you make at a grander scale at some point!


    1. Hi Aleksi, that looks great! I can see I'd need to up the thickness a little and smooth the mesh more, but that's worked really well. Is there anything specific which would help the printing? Should I round off the edges or anything?


    1. Then just enjoy the pictures! Seriously though, if you wanted to learn more, most of the maths is trigonometry, along with spherical co-ordinates (which are just another application of trigonometry). The hardest part (for me) was getting everything in the right format for a 3D printer.


  7. Heya! Let’s try again with thicker walls, round outer surface and of course you could try softening some of the edges!


    1. Thanks! And nice work, that turned out surprisingly good. Is that from the original .blend/.stl file at the end of the post? I know Netfabb had an issue with some of those meshes.


      1. It’s the .stl with thicker walls you posted in a comment. I had to simplify it with meshlab before the makerbot software would accept it.

        Here’s the file I printed:

        I think it would come out really well if printed in a larger size and with supports, but the print time would be 30+ hours on the makerbot.


  8. This is really great! Many thanks for sharing your approach and the files as well. I am also going to see how successfully they print. I am particularly interested in seeing how easy it would be to set up a not-too-technical, step-by-step workflow, so that children can create their own picture or pattern which could then be transformed (for them) into a 3D-printable lamp file. Am I right in understanding that you managed to do everything using Matlab and Blender? As Matlab is sadly prohibitively expensive for this one application, what are your thoughts, if any, on being able to do this using GNU Octave?


    1. No problem, glad you like the idea. The 2D mesh creation was all done in Matlab, using a couple of tools also written in Matlab. To extrude the mesh into a 3D shell I used the ‘solidify’ modifier in Blender.

      It looks like you'll be fine with Octave though; it seems the three main steps can be done using the following:

      Identify image regions with bwconncomp

      Generate contours:

      Simplify and mesh contours: or

      I don’t have any experience using Octave, hopefully these links are useful anyway!


      1. Many, many thanks for responding (I wasn't notified, hence my delayed reply).
        I will take a look at the links and see how far I get – really appreciate this.


      2. I'm 8 years late to the party but was able to partially get this code running in Octave. I successfully got to the point in your article where it is stated: "This is great so far but at the moment we only have the outlines of the holes, we now need to fill in the blanks". I am satisfied that the plots look correct, but I found that the Mesh2D "refine2" function fails as follows:

        Refine triangulation…

        |ITER.| |CDT1(X)| |CDT2(X)|
        QH6235 qhull error (qh_memalloc): negative request size (-1610578032). Did int overflow due to high-D?

        I'm uncertain at the moment what these lines of code do, but the refine2 function fails:
        = @(x,y) 0.05*(1 + ((x.^2 + y.^2)/a^2)).^2;
        [p,t, tria, tnum] = refine2(allnodes, alledges);

        It seems that for Octave, because qhull fails, a different solution may be required for the constrained Delaunay triangulation of the polygonal region {NODE,EDGE}. Bummer.


  9. Hey there… would you actually be so kind as to share the Matlab files which you used to generate this? Drop me a mail if the answer is a yes… 🙂


  10. I'm trying the flower ones but the shapes are really not designed to be printed. So many floating objects and overhangs! I'll let you know how it turns out.


  11. Hi,

    I am enjoying going through your tutorial, but I could not understand how you wrapped the 2D image into a 3D one. Can you help me out?


    1. Hi, the 2D -> 3D conversion comes from the equations at the top of the blog post. Each point in the image has (x,y) coordinates, but no z coordinate. For each point you have to calculate new (x,y,z) coordinates which correspond to a stereographic projection, which is what the equations represent.


  12. I'm having issues using your code:
    P((xg – Nmin/2).^2 + (yg – Nmin/2).^2 > 0.99*0.25*Nmin^2) = 0;

    this line gives me the error
    Error: File: Try1.m Line: 7 Column: 42
    Unbalanced or unexpected parenthesis or bracket.

    Could you please help me? (Or share the Matlab files/full code
    so I can try it…)
    Thank you very much


  13. Hey,
    I'm trying to use the code you uploaded, but I'm having issues with the new Mesh2D version (after installing it the code doesn't work): "Undefined function 'mesh2d' for input arguments of type 'double'."
    could you please help me with that?


    1. Hi James,

      Once you downloaded and extracted the Mesh2d zip file, did you add the folder to your path? The mesh2d.m file should be in there, along with lots of other functions.


  14. Hey,
    After checking, the new version of Mesh2D uses refine2 instead of the mesh2d function…
    I also have a problem with the stlwrite function:
    Error using reshape
    Product of known dimensions, 9, not divisible into total number
    of elements, 28362.

    Error in stlwrite (line 81)
    facets = reshape(facets(:,faces’), 3, 3, []);


    1. Hi James,
      Did you happen to figure out the solution to your problem? I am encountering the same issue. I am wondering if it is because the app "MESH2D: Delaunay-based unstructured mesh-generation" has been updated and part of it is now incompatible.

      Refine triangulation…

      |ITER.| |CDT1(X)| |CDT2(X)|
      Warning: Duplicate data points have been detected and removed.
      The Triangulation indices and constraints are defined with respect to the unique set of points in delaunayTriangulation.
      > In deltri2 (line 103)
      In refine2>cdtbal0 (line 311)
      In refine2 (line 260)
      6 848 861
      10 964 3550
      20 1036 6040
      27 1037 6105


  15. Hi,
    For a school project we are making a stereographic lampshade. We have created a .mat file containing the coordinates for the sphere. Unfortunately we don't know how to import this file into Blender. We saw in your blog post that you imported a file from Matlab into Blender, so maybe you can help us out?
    Thanks in advance!

