Some of you have already seen my latest experiment, which I published via Twitter some days ago. Nothing new since then, I just want to put it up here as well. This is my second time trying out three.js. It’s a very competent framework with lots of features and examples available. If you have an idea, it’s blazing fast to put together a quick demo.
The inspiration for this experiment was planted in my own garden. The picture to the left shows the beanstalk growing at massive speed. The virtual one uses the object from the last demo as a base, but without the recursive child branches. It’s a custom geometry object, so I can control the initial shape and UVs, and save references to the vertices in different arrays for later manipulation. The illusion of movement is created with points acting as offset values for each ring of vertices in this tube (or cone), but controlling the xy-position only. Those values are calculated just like a classic ribbon: the body recursively follows the head. In each frame, the offset of one joint is copied from the one in front of it. The speed is directly tied to the frame rate and the length of a segment; otherwise I would have to interpolate the values somehow.
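The follow-the-head logic can be sketched like this. This is a minimal illustration in my own words, not the actual demo source; the names (`joints`, `updateJoints`) are mine:

```javascript
// One xy-offset per ring of vertices in the tube (or cone).
const SEGMENTS = 64;
const joints = [];
for (let i = 0; i < SEGMENTS; i++) {
  joints.push({ x: 0, y: 0 });
}

// Called once per frame: the head ring gets a fresh offset, and every
// other joint copies the offset of the joint in front of it, so the
// body trails the head like a classic ribbon.
function updateJoints(headX, headY) {
  for (let i = SEGMENTS - 1; i > 0; i--) {
    joints[i].x = joints[i - 1].x;
    joints[i].y = joints[i - 1].y;
  }
  joints[0].x = headX;
  joints[0].y = headY;
}
```

Because one joint advances exactly one segment per frame, the apparent speed is frame-rate × segment length, which is the coupling mentioned above.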
At a given interval, a leaf (an object exported from 3d studio max) is spawned at the position of the head and then, on each frame, translated along the z-axis by the length of a segment. One important thing here is to reuse those leaves: when they are behind the camera, it’s time for a swim in the object pool until they are needed again.
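The pooling idea is simple to sketch. This is a hypothetical minimal version (the real demo would hold meshes loaded from the 3d studio max export rather than plain objects):

```javascript
// Pool of leaves that have drifted behind the camera.
const pool = [];

// Stand-in for the real mesh creation / model loading.
function createLeaf() {
  return { z: 0 };
}

// Reuse a pooled leaf when one is available; only allocate when the
// pool is empty, so the number of live objects stays bounded.
function getLeaf() {
  return pool.length > 0 ? pool.pop() : createLeaf();
}

// Called once a leaf has passed behind the camera.
function releaseLeaf(leaf) {
  pool.push(leaf);
}
```

Spawning then becomes `getLeaf()` at the head position, and a per-frame check hands passed leaves back with `releaseLeaf()` instead of letting them pile up for the garbage collector.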
The control of the plant is made up of three components: mouse movements, the sound spectrum and a circular movement. The style changes every 15 seconds and blends these components in various ways. The sound spectrum, or amplitude, is exported with this little Python snippet from @mrdoob.
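One way to blend the three inputs is a simple weighted mix, where the weights are what the style switch changes every 15 seconds. This is just a sketch of the idea with names of my own choosing, not the demo’s actual code:

```javascript
// Blend the three control inputs into one head offset. `w` holds the
// per-component weights that the current style defines.
function blendControls(mouse, audio, circular, w) {
  const total = w.mouse + w.audio + w.circular;
  return {
    x: (mouse.x * w.mouse + audio.x * w.audio + circular.x * w.circular) / total,
    y: (mouse.y * w.mouse + audio.y * w.audio + circular.y * w.circular) / total,
  };
}

// The circular component is just a point moving around a circle over time.
function circularAt(t, radius) {
  return { x: Math.cos(t) * radius, y: Math.sin(t) * radius };
}
```

A style that sets the mouse weight to zero, for example, makes the plant dance purely to the audio and the circular sweep.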
I made some tests with post-processing effects like bloom as well, but the result was not what I wanted, so I ended up disabling them. The bloom effect is still in the source though, if you want to see how it’s done. (Forked from the ro.me project, BTW…)
I’m having some problems with artifacts between intersecting objects though. I have minimal experience with depth-buffer tests and triangle sorting, so I can’t tell whether it’s a standard problem or caused by me or the engine. Depth-buffer resolution is my guess: too long a range between the near and far planes, and too few “steps” in the depth texture. Maybe it’s different on different machines and hardware. If you have a clue, just let me know!
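To illustrate that guess (this is not code from the demo): the value a standard perspective depth buffer stores is non-linear in the eye-space distance, so most of the precision is spent close to the near plane. Shrinking the near/far range, especially raising `near`, leaves more distinct steps for the mid-range where the plant and leaves intersect:

```javascript
// Normalized depth-buffer value in [0, 1] for a surface at positive
// eye-space distance z, with near/far plane distances `near` and `far`.
// This is the standard perspective depth mapping.
function depthBufferValue(z, near, far) {
  return (far / (far - near)) * (1 - near / z);
}

// Two surfaces at z = 100 and z = 101 with a huge near/far range land
// almost on top of each other in the depth buffer...
const d1 = depthBufferValue(100, 0.1, 10000);
const d2 = depthBufferValue(101, 0.1, 10000);

// ...while raising the near plane to 1 spreads the same two surfaces
// roughly ten times further apart, reducing z-fighting.
const d3 = depthBufferValue(100, 1, 10000);
const d4 = depthBufferValue(101, 1, 10000);
```

So if the artifacts really are depth-resolution related, tightening the camera’s near/far planes would be the first thing to try.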
I really want to do real projects with real 3D graphics soon. I envy those of you who can put on the label “a Chrome experience”. However, I hope these little demos made by the community help spread the knowledge and inspiration that you can do more than games with this technology. Some of you have clients with tech-friendly target groups who have the latest browsers installed, or who at least know where the update button is. Let’s help the future along and start accelerating some hardware!
Enough talking, show me the demo already!
Some more pictures: