Dec 18 2011
 

Here’s a simple script that lets you ‘bookmark’ camera positions.

It’s a little redundant since the same thing can be accomplished a number of different ways, but after a discussion with another BR BUG member I thought I’d take a swing at it. So here’s the script (right-click > save as)

camera_marks

Just uncompress it, copy the script to your addons folder, and activate it.

You get a panel in the 3DView>Toolbar that lets you add, delete, and change marks.

 

 

 

 

 

It saves the loc/rot/dofdist/lens to a textblock called CAMERAMARKS. I know this is a hacky implementation, but I’m a hack and that’s just what I do. It’s super rudimentary right now, but it was only about 3 hours of coding, and most of that was looking through old code to remember how to set up the UI, which I needed a refresher on anyway, so this was a good little project. So tell me what you think: whether it’s useful, what it needs to become useful, etc.
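If you’re curious what a ‘mark’ amounts to, here’s a minimal sketch of the idea, assuming a simple one-line-per-mark format (illustrative only, not the addon’s actual code):

import bpy

# Illustrative sketch only: store/restore a camera 'mark' in a text block
# named CAMERAMARKS, one mark per line (name, loc xyz, rot xyz, dofdist, lens).
def save_mark(name='mark'):
    cam = bpy.context.scene.camera
    txt = bpy.data.texts.get('CAMERAMARKS') or bpy.data.texts.new('CAMERAMARKS')
    loc, rot, d = cam.location, cam.rotation_euler, cam.data
    txt.write('%s %f %f %f %f %f %f %f %f\n' % (name,
        loc.x, loc.y, loc.z, rot.x, rot.y, rot.z, d.dof_distance, d.lens))

def restore_mark(name='mark'):
    cam = bpy.context.scene.camera
    for line in bpy.data.texts['CAMERAMARKS'].as_string().splitlines():
        parts = line.split()
        if parts and parts[0] == name:
            vals = [float(v) for v in parts[1:]]
            cam.location = vals[0:3]
            cam.rotation_euler = vals[3:6]
            cam.data.dof_distance = vals[6]
            cam.data.lens = vals[7]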

Jul 11 2011
 

BA Thread

What it Is – ProgrAnimals is an initial framework for variable-morphology, pose-controlled, physics-based animation in Blender, whatever that means.

What it Does – In simplest terms, this script creates a game-engine-physics-based ‘walking ragdoll’ from a user-defined armature, anything from a humanoid biped to something much more bizarre. I think the game ‘Spore’ is probably the best analogy, where people can make creatures and they sort of walk or hobble based on how they’re built. This is the same notion, except much less automatic right now. The user has to define a little more about structure, muscles, and gait.

Why it Be – This is inspired first and foremost by my enduring love of doing horrible things to ragdolls, for which I make no excuses or apologies. The implementation is inspired especially by Cartwheel-3d, SIMBICON, and Philippe Beaudoin, and also by Ari Shapiro’s DANCE. The future goals of this are somewhere between Endorphin, Spore, and may also include possible robotics integration with Arduino. Ultimately I’d like to create a break-dancing robot army to help me stamp out tyranny, oppression, and tired dance moves.

This is the unexpected result of some of what I learned from the whole progranimation thing. I thought I would make a more complex pose-control system, but I ended up wanting to make a more generalized system that could work with arbitrary leg configurations. Originally I think it had something to do with making circuit components walk around on a circuit board, but that’s another story; I might still do that. Anyway, this is actually a vastly simplified controller compared to Uno and Deuce in progranimation. No balance feedback, no uprightness vector, not even a collision sensor on the feet; it’s just proportional-derivative controllers and a timed state machine. It’s about the most bare-bones pose controller I think you could make, but it’s still hilarious to watch it make stuff walk around. I hope to expand from here by giving the torso and legs considerably more options as far as number of states, balance sensors, state-switching reactions, etc. First I wanted to get something functional, and this is it.
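Here’s about all there is to one of those PD ‘muscles’, just as a sketch (the names are made up, not the script’s actual functions):

# Torque that pulls a joint toward a goal angle, scaled by stiffness (kp)
# and damping (kd). This is the whole idea behind a proportional-derivative 'muscle'.
def pd_torque(goal_angle, current_angle, angular_velocity, kp, kd):
    return kp * (goal_angle - current_angle) - kd * angular_velocity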

I hope this will be useful and interesting, but it may take a little time to get a creature working. That said, it can be very fun and I hope I’ve made it easy enough to use that most Blender users can figure it out. Here are a couple of walk-throughs that demonstrate how the process works.

I. EDITING WALK STYLES – Changing the gait of a pre-made biped rig.

1) Download this file ProgrAnimals-I-biped.blend and open it in Blender 2.58a.

2) Press ‘P’ to run game engine. Watch as very little happens. Press ‘ESC’ to stop game.

3) Select the iARMa-pg-0 armature (pose-goal armature). Switch to POSE MODE, and change the pose to something approximating this screenshot.

[Exact settings: BONE x|y|z rotation : THIGH0 -45|0|0 | SHIN0 90|0|0 : THIGH1 15|0|0]

4a) Run game again (P).

EXPLANATION: The left leg in the pose-goal armature is the ‘Swing Leg’, the right is the ‘Stance Leg’. They are the pose-goals that the leg will try to assume when in this state. Poses can also be stored in poseLib for convenience.

4b) while not bored: change settings; run game again (P).

5) Select the iARMa armature and switch to EDIT MODE.

6) Select each THIGH bone and change the bone’s Custom Property legstate to 0 on both.

7a) Switch to OBJECT MODE. Run game again (P).

EXPLANATION: legstate defines whether the leg starts in swing or stance; setting both to the same value makes the ragdoll jump.

7b) Change one thigh back to legstate = 1 (make the ragdoll walk again)

8) Select the iARMa-THIGHl-k empty (best to select it from the outliner). Look at the Scale values in the ‘N’ properties view. Change the ‘X’ scale to 10. Do the same on iARMa-THIGHr-k.

9) Run game again

EXPLANATION: The scale settings on these empties define the stiffness and damping parameters for their joints. (See progranimation for a better explanation of PD controllers.)

10) Edit the iARMa armature. Select the TORSO bone and change the bone’s Custom Properties: statetimer = .25; zgoal = ‘BALL’.

11) Switch to OBJECT MODE. Run game again (P)

EXPLANATION: statetimer times the switch between states; zgoal defines the heading goal of the character. zgoal can be the name of an object, or an angle in degrees using the prefix ‘a:’, e.g. a:45.
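In other words, the goal string gets interpreted something like this (a sketch with made-up names, not the script’s actual code):

import math

# Illustrative only: 'a:45' means a fixed heading of 45 degrees; anything
# else is treated as the name of a target object in the bge scene.
def resolve_heading(zgoal, scene, char_pos):
    if zgoal.startswith('a:'):
        return math.radians(float(zgoal[2:]))
    target = scene.objects[zgoal]
    dx = target.worldPosition.x - char_pos.x
    dy = target.worldPosition.y - char_pos.y
    return math.atan2(dy, dx)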

 

II. SETTING UP A NEW CREATURE

1) Download this file ProgrAnimals-II-quad.blend and open it in Blender 2.58a.

2) Select the iARMq armature.

3) In the script editor make sure the variable setupSEQ = 0. Run the script (ALT-P). This script will add custom properties to the bones of this armature based on the names of the bones.

BONE SETUP/NAMES:
The script will use bone names as a starting point to set up constraints with axis and angle limits for standard joints (see the code for values): THIGH, SHIN, FOOT, TOE, REVKNEE. Just use these strings somewhere in the bone name. The names aren’t strictly necessary, but they make setup much easier since they determine which Custom Properties get added to the bones (there’s a sketch of this after the property list below).

CUSTOM PROPERTIES ADDED TO BONES:
TORSO
statetimer: time to switch states
zgoal: goal in degrees ‘a:45’ or object name ‘Cube’
THIGHS
legstate: leg start state (0-swing/1-stance)
legtype: type of leg config (for multi leg setups)
ALL BONES:
usebone: create box for this bone
xyzmin/xyzmax: values to be used to create 6dof rigid body joints
*usebone and legtype are the only values that cannot be modified after this step
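Here’s a rough sketch of what that first setup pass amounts to (the property names come from the list above, but the values and logic here are made up, not the script’s actual code):

import bpy

# Illustrative only: tag bones with Custom Properties based on their names.
arm = bpy.context.object
for bone in arm.data.bones:
    bone['usebone'] = True
    if 'THIGH' in bone.name:
        bone['legstate'] = 1        # 0 = swing, 1 = stance
        bone['legtype'] = 0         # which pose-goal armature this leg uses
    if 'TORSO' in bone.name:
        bone['statetimer'] = 0.5    # seconds between state switches
        bone['zgoal'] = 'a:0'       # heading goal: object name or 'a:<degrees>'
    bone['xyzmin'] = [-45.0, -45.0, -45.0]   # joint limits used for the 6dof constraint
    bone['xyzmax'] = [45.0, 45.0, 45.0]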

4) Select the armature, enter EDIT MODE. Make sure Custom Properties were added to bones. Select the THIGH bones. Change the value of legtype so the front legs are ‘0’ and the back legs are ‘1’. This will let the program distinguish between the different types of legs and will create pose-goal armatures for each leg type. You can have as many types of legs as you want, but for now they only have 2 states each: swing/stance.

5) Switch to OBJECT MODE. Select the armature. In the script editor set variable setupSEQ = 1. Run the script. This will create the boxes and empties for each bone. Boxes are game objects connected by rigid body constraints. Empties define stiffness/damp gains at each joint with their xyz scale values.

6) Select the armature. In the script editor set the variable setupSEQ = 2. Run the script. This will create the pose-goal armature(s) used to define swing/stance state pose goals for each leg type.

7) Adjust pose-goal armature poses as desired. Press ‘P’ to run game engine.

8) Design a crazy five-legged monster and make it chase some other crazy thing around.

9) Game> Record Animation. Rinse, Repeat.

NOTES FOR MAKING YOUR OWN CREATURE:
*Legs must at least have a thigh; the rest is optional. Feet are good to have though.
*Legs can be nearly any config as long as they do not branch (no double-footed legs or anything like that, a long tail is fine though).
*Use very short names on armature or bones, under 5 chars each (sorry).
*Do not use ‘-‘ (dashes) in the names of bones or armatures. I need those.
*The mass of each bone-box is determined by the length of the bone. This can make for very heavy limbs, so you might want to adjust this. You may also want to change the shape of certain bone-boxes; that should work as long as you don’t change the origin.
*The settings created for THIGH, SHIN, FOOT, etc. are generic and can be changed to anything. The settings for each joint are a combination of the rigid body constraint settings and the stiff/damp parameters. The initial setup of the rigid body constraint is defined by the xyzmin/xyzmax settings on the bone; you can change these Custom Properties before you create the bone-boxes, or adjust them in the constraint settings after they’re created.
*Characters should be created facing Y+ forwards.

 

III. TIMER OFFSET – Offsetting state switch timing.

This is a little cheat that offsets the state switch timing for a leg by a given amount. This allows sequential legs like the centipede guy, and galloping quadrupeds and whatnot. Setting it up can be a little tricky and depends on the creature design and the walk style you’re looking for.

Here is the .blend file with the script that includes the timeroffset feature. The example files for the previous tutorials do not have it. This file also contains all the creatures used in the demo video.

The timeroffset property is added to each leg bone in the armature when the first part of the setup is run (setupSEQ = 0). When the game objects are created, the legs use this bone property to offset when their states are triggered.
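Conceptually it’s just a phase shift on the state timer, something like this (an illustrative sketch, not the actual controller code):

# Illustrative only: a per-leg timeroffset shifts when that leg flips
# between swing (0) and stance (1).
def leg_state(sim_time, statetimer, timeroffset, legstate):
    phase = int((sim_time + timeroffset) / statetimer)
    return (legstate + phase) % 2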

That’s about it for the offset. It was a lot cheaper than breaking into more than two states and it works pretty well considering how simple it was to implement.

 

IV. BIG WARNING – BUGS and ERRORS and CRASHES! OH MY!

Creating the pose-goal armature in the setup (setupSEQ = 2) turned out to be a total fiasco. I had to con bpy.ops into making some bones for me, and every once in a while it just refuses. It’s a context thing, but I never could figure out why it works sometimes and sometimes doesn’t. So enjoy that. If you have a problem with a rig, try it 3 more times; if it still won’t work, feel free to contact me and I’ll try to help you sort it out.


V. STUFF I HAVEN’T THOUGHT OF YET

I may add notes here in the future when I think of them.

Hey, here’s something: you can ‘deadleg’ a creature by turning all the usebone bone properties to False in a leg. I guess you could make a limping zombie character or something, though you’ll have to compensate the stiff/damp on the other leg so it’s strong enough to carry the weight, and the poses will have to be offset so he stays sort of balanced.

Notes – Once you start playing with pose-goals you may notice that there is something horribly wrong with the joint torques. An extreme pose can lift the character off the ground and fly (not in a cool superman way, more like falling up). Clearly this is not physics. Working on that.

Also, the feet didn’t stick to the ground like I’d expect, so I added a hack that zeroes the x/y linear velocity of the last object in the chain of the stance leg, which ends up being the toe if you have one. So even in flight, forward momentum is lost to this bs-friction-hack. Later I might try adding material friction to the feet or something.
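In bge terms the hack is roughly this (a sketch; toe_obj stands in for whatever the last link of the stance leg happens to be):

# Illustrative only: kill horizontal velocity on the stance leg's last link.
vel = toe_obj.getLinearVelocity(False)              # world-space velocity
toe_obj.setLinearVelocity([0.0, 0.0, vel.z], False)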

In the future I’d really like to improve the whole implementation in Blender but I think the key will be porting it to the animation system once we get rigid body dynamics in the physics system. It would be great to have these rigs interact more cleanly with other dynamics. I love the game engine but the ‘Record Animation’ thing makes this a really weird workflow, glad it’s there though.

Why No UI? – Mostly because the whole setup script is kind of sketchy as it is; it randomly crashes if you keep remaking ragdolls from the same armature, for some reason. Plus these are program-driven animations, so I think if you’re going to use this for anything you’re going to have to get into the code eventually anyway.

Personal Request – If you make a cool critter with this please show me! A render is worth a thousand comments. Hope you enjoy!

Jun 18 2011
 

To aid my never-ending quest to remember the things I learn about programming, I’ve decided to make this post a place to leave little functions and crap that I always forget and that are way too stupid to have to figure out more than once.

It’ll grow, I’m constantly learning things I forgot.

######################################
################ WARNING: ############
############## STUPID CODE ###########
######################################
#### PLEASE DO NOT LEARN ANYTHING ####
######################################

from math import sqrt, hypot, asin, acos, pi, floor

def sbh_within(x, y, d):
###---CHECK IF x-d <= y <= x+d
    return x - d <= y <= x + d

def sbh_center2d(Ax, Ay, Bx, By):
###---GET CENTER BETWEEN TWO 2D POINTS
    Cx = Ax + (Bx - Ax)/2
    Cy = Ay + (By - Ay)/2
    return Cx, Cy

def sbh_dist2d(Ax, Ay, Bx, By):
###---GET DISTANCE BETWEEN TWO 2D POINTS
    return hypot(Bx - Ax, By - Ay)

def sbh_hypot(a, b):
    c = sqrt(a*a+b*b)
    return c

def sbh_getAngle2d(A, B):
###---RETURNS HEADING FROM A TO B
    opp = B.y - A.y
    adj = B.x - A.x
    hyp = hypot(opp, adj)
    s = asin(opp/hyp)
    c = acos(adj/hyp)
    if B.x >= A.x and B.y >= A.y: r = s-pi/2
    if B.x <= A.x and B.y >= A.y: r = c-pi/2
    if B.x <= A.x and B.y <= A.y: r = (pi/2)-s
    if B.x >= A.x and B.y <= A.y: r = -(pi/2)-c
    return r

def sbh_si(num):
###---RETURNS SIGN OF A NUMBER
    if num>=0: return 1
    if num<0: return -1

def sbh_wrapAng(x):
###---WRAP EULER ANGLES (-pi <= a <= pi)
    p360 = pi*2
    f = x - (p360)*floor((x+pi)/p360)
    return f

—OTHER BLENDER PYTHON API NOTES—
Most of these are issues I’ve run into with and without solutions.

– Can not add game logic bricks via the Python API. [SOLVED]
– Can not access poseLib data with bpy.data, only bpy.ops functions and none to read pose data.
– Voxel data files: .raw/.bvox – can’t use multi-frame data. Only 64x64x64 grid size works.

– active object – accessible with “bpy.context.scene.objects.active = ob”

 

May 27 2011
 

Trying to learn more about making controllers for actual robots, but the bge is cheaper than buying components. Thought I’d start simple so I made a little differential drive robot and some control stuff. Wanted to spice it up so I gave him a little scoop so he could pitch a ball up.

For now the drive control is completely naive and the target (ball) sensor is arbitrary, so it doesn’t mimic a real sensor like I meant it to. Working on the motor controls more right now.

.blend file (b2.57b)

The bot gets the ball, then goes to the ‘goal’, then pitches the ball up. No aiming yet, doesn’t even work real hard to point in the right direction. Sort of a spaz shot.
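For reference, the steering idea is nothing more than biasing the two wheel speeds against the heading error; a rough sketch in bge terms (the names and gains are made up, not what’s in the .blend):

import math

# Illustrative only: compute left/right wheel speeds for a differential
# drive bot so it turns toward a target object.
def drive_toward(bot, target, base_speed=5.0, turn_gain=2.0):
    to_target = target.worldPosition - bot.worldPosition
    heading = bot.worldOrientation.to_euler().z
    err = math.atan2(to_target.y, to_target.x) - heading
    return base_speed - turn_gain * err, base_speed + turn_gain * err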

Actually this was an offshoot of some problems I was having while trying to make a 0G ‘satellite’ type bot. So I thought I’d limit myself to 2D for now.

May 10 2011
 

I don’t even know…

Okay, I guess I do know, I made it.
Why?
I don’t even know.
How?
bge and a simple gravity script of course.
wanna hear it? here it goes!

import bge
from mathutils import Vector
co = bge.logic.getCurrentController()
scene = bge.logic.getCurrentScene()

obList = []
for o in scene.objects:
    if o.__class__ == bge.types.KX_GameObject:
        obList.append(o)

def calcGrav(obA, obB, G):
    # force on obA from obB: F = G*m1*m2 / r^2, along the unit vector toward obB
    m1 = obA.mass; m2 = obB.mass
    m = m1*m2
    loc1 = obA.worldPosition
    loc2 = obB.worldPosition
    v = loc1 - loc2
    r = v.length
    if r == 0.0:
        return Vector((0, 0, 0))
    F = G * ( (m) / (r*r) )
    return -v.normalized() * F

def loopGrav(obs, G):
    for obA in obs:
        fV = Vector((0,0,0))
        for obB in obs:
            if obA != obB:
                fV += calcGrav(obA, obB, G)
        obA.applyForce(fV, False)

loopGrav(obList, 1)

keyb = bge.logic.keyboard
if keyb.events[bge.events.ZKEY]>0:
    for ob in obList:
        ob.applyTorque(Vector((0,0,50)), False)

if keyb.events[bge.events.XKEY]>0:
    loopGrav(obList, -2.5)

if keyb.events[bge.events.CKEY]>0:
    loopGrav(obList, 15)


HOW TO:
– copy/paste script into blender text editor.
– set up an EMPTY object with logic bricks as shown.
– change ‘Engine’ from ‘Blender Render’ to ‘Blender Game’
– set World>Physics>Gravity to 0.0 (under Bullet)
– create some objects.
– hit ‘p’

This script will (should/might) make all meshes in the scene obey Newton.

Apr 13 2011
 

Blender.org Tracker –  BA Thread –  BlenderNation Article

UPDATE – 01.05.13 – v0.2.7
FIXES:
-Fixed the issue that prevented the addon from loading in Blender 2.63+. Also fixed a problem with the makeMeshCubes function.

UPDATE – 02.14.13 – v0.2.6
ISSUES:
-An API change has made the addon not load. It has something to do with bpy.scene.context. I’ll look into it when I have a chance. Fortunately the script will still run as a regular script. If you want to use it, just use the .blend file below, or load it in the text editor and hit run.

UPDATE – 06.20.11 – v0.2.6
FIXES:
-Moderate speedup.
-Cubes output scale/loc corrected.
-Tooltips added.
NEW FEATURES:
-Container insulators: Arbitrary mesh shapes can be used as insulator/cloud objects. Still quite imperfect though. May slow down generation. Best for simple containers: bowl, cup, bottle. A spiral pipe would not work well. Must have rot = 0, scale = 1, and origin set to geometry.
-Mesh origin objects: If the origin obj is a mesh, vert locations will be used as initial charges. However, this will disable multi-mesh output. May slow down generation.

UPDATE – 05.08.11 – v.0.2.5
NEW STUFF:
-added ‘single mesh’ output option. use this mesh with build modifier to ‘grow’ lightning in animation.

 

 

This is a partial implementation of the algorithm presented in the paper ‘Fast Simulation of Laplacian Growth’, with some concepts borrowed from ‘Fast Animation of Lightning Using an Adaptive Mesh’.

It currently uses simplified spherical boundary conditions and calculates potential at candidate growth sites using FSLG Eqn. 9. To be properly influenced by an environment map of charges, and to allow artistic manipulation of growth patterns, I will need to implement FSLG Eqn. 15, which I don’t fully understand yet.

As compared to the simulation times reported in the paper, there is no comparison. This is not fast. They report 2000 particles in 6 seconds. So far 1000 particles will take a few minutes.

A good chunk of the reason for the slowness is the weighted random choice function. Another big reason is that it’s Python, not C. Probably the biggest reason is that I’m a hack and I barely cobbled this together, so it’ll take time to get it optimized.
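For context, the weighted random choice in question is the plain linear-scan kind, something like this (a generic sketch, not the addon’s actual function):

import random

# Illustrative only: pick index i with probability weights[i] / sum(weights).
# A linear scan like this gets slow when there are thousands of candidate sites.
def weighted_choice(weights):
    pick = random.uniform(0.0, sum(weights))
    running = 0.0
    for i, w in enumerate(weights):
        running += w
        if pick <= running:
            return i
    return len(weights) - 1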

Anyway I think it’s better than making lightning by hand, or at least might give you a good base mesh to mess with. So it might be useful to someone. I’ll keep working on it.

DIRECTIONS:
you can use the example .blend file or load as an addon.

BLEND FILE:
-Download .blend* file, instructions in file. *right-click, save-as [works with Blender 2.5 – 2.62]
-Download .blend* file, instructions in file. *right-click, save-as [works with Blender 2.63+]

ADDON:
-Download script (object_laplace_lightningv026.rar) [this version of the addon only works with Blender 2.5 – 2.62]
-Download script (object_laplace_lightningv027.zip) [this version of the addon works with Blender 2.63+]
-Uncompress, place in the Blender ‘addons’ folder:
Blender install folder/2.6x/scripts/addons/
-Enable addon
UI will be in >View3D>Tool Shelf>Laplacian Lightning (object mode)

-Hit ‘generate’ – try w/ defaults

-Play with the settings, try again.

iterations – how many times to run the loop (number of particles)
grid unit size – size of a ‘cell’ in BU
straightness – user variable to control branchiness/straightness
start charge – origin point
use ground charge plane – hacky method of simulating a lightning strike. Terminates the loop if the lightning hits ‘ground’.
ground Z co – z coordinate of the ground plane
ground charge – charge of the ground plane
mesh, cube, voxel – visualization outputs
mesh – creates a vert/edge mesh from the data
cube – creates cube objects from the data
voxel – creates a 64x64x64 voxel data file from the data, output to ‘FSLGvoxels.raw’ (experimental)

Hope you enjoy. Send me a link to some renders if you use it! Especially if you get >10,000 particles.

Apr 12 2011
 

I started on a script to allow real-time mocap in Blender via a socket connection to Brekel Kinect.

I got it about 60% working. The problem is bizarre. Legs, torso, and head track fine, but shoulder and elbow rotations are completely out of whack.

I posted asking for help on Blender Artists and didn’t get any responses.

Got side-tracked on some other things but I thought I’d stay up-to-date and post what I have here in case it’s just what you’re looking for, or close enough to get you on the right track. If so- let me know how it goes!

Here’s the .blend file.

Just run Brekel Kinect (I use v.40) and turn NITE tracking on. Then load this .blend and run the game (‘P’). (Make sure the GAMEempty BOOL game property is ‘FALSE’ or the server will not start.)

Mar 31 2011
 

[FILES UPDATED TO WORK WITH 2.57b]

.blend files (2.57b – r36339) right-click, save-as
Uno/Deuce/Pogo
Rigid Body Walking Machines
Extras

link to BA thread

This is a collection of studies and experiments in rigid body dynamics and pose control using Blender’s game engine and Python API. All the animation data was generated in real-time simulations (with the exception of the camera, which was keyframed, and the cloth, which was not real-time).

———NON-BLENDER/3D NERD BRIEF———
These animations are essentially recordings of 3D games. The games have gravity, inertia, object collisions, joints connecting objects, and ranges for those joints. Each scene was created by setting up a game and running it several times (sometimes with simple keyboard interaction, which pushed or propelled a projectile). The animation data (the rotation and location of each object) was recorded during each game. The most amusing or interesting results were saved, and one was selected to be rendered later.
Camera moves were keyframed (traditional/manual 3D animation) for more interesting views. Hours and hours of painful rendering, and voila.

The rectangular biped and monopod characters are separate programs within each game. Each has the ‘joints’ and ‘muscles’ of a simplified human. The joints are functions that limit the location and rotation of an object (such as a hip) with respect to another object (such as a torso).

The muscles are ‘PD’ functions (proportional-derivative). At each frame the functions return the torque value that must be applied at a given joint to attempt to reach a given ‘pose’. These actually behave more like damped-springs than muscles.

The poses are defined by a ‘finite state machine’. This is a set of inputs and instructions that tell the robot (or a part of the robot) what state, or pose, it should try to be in. For example, the robots current ‘state’ is its left leg is down (‘stance’ state) and its right leg is swinging (‘swing’ state). The robot detects that its swing leg is now ahead of the stance leg and has made contact with the ground. The state machine will now instruct the stance leg to change to the swing state, and vice-versa.
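In made-up code, that switch is about this small (illustrative only, not the actual controller):

# Illustrative only: swap swing/stance when the swing foot has landed
# ahead of the stance foot. 'legs' is whatever holds each leg's sensors.
def update_leg_states(legs, swing, stance):
    if legs[swing].foot_contact and legs[swing].foot_y > legs[stance].foot_y:
        return stance, swing    # former stance leg becomes the swing leg
    return swing, stance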

There are other states that determine what to do when the robot is off-balance (center of mass outside the center of the feet) and when the legs become crossed. The monopod has other sensors and states that determine whether it is in a safe position to flip or whether it should just hop (it sometimes gets this wrong), and it compensates for linear velocity (tries to slow forward momentum after each jump).

The weird glowing red walking machines are another thing entirely.

These are not programs; they are structures ‘built’ in a game that work pretty much like they look like they work. The only ‘animation’ that is done is that their ‘motor’, the red glowing thing, is rotated by magical Hogwarts force. Everything else is a result of that motion. Axles, cranks, rods, levers, etc. do what you’d expect them to do (though with infinite strength and sometimes invisible and/or impossible connections).

The designs are more or less lifted. The hexapod and ‘toothpicks’ are the only ones I didn’t actually look at internal blueprints of something for, and those are clearly not original mechanisms either.

There are a few other tests and experiments that worked their way into this collection but I won’t go into detail about those. If you’re curious- ask. Better yet, look it up. Better still, figure it out yourself.

———FOR THE BLENDER/3D INCLINED———

—POSE CONTROLLERS—
Simple pose-controlled ‘robots’ were created in Python after studying the work of Ari Shapiro (DANCE – Dynamic Animation and Control Environment) and Philippe Beaudoin (SIMBICON/Cartwheel-3d). Primarily:
-Generalized Biped Walking Control. Siggraph 2010
-Composable Controllers. – Siggraph 2001
-Controller Development. – Siggraph 2007

Not surprisingly, most of the brilliant methods put forth in these papers are absent in these programs. These are just ‘steps’ towards gaining a better understanding of dynamic character control concepts.

These robots are finite state machines with proportional derivative controller ‘muscles’. They also respond to feedback provided by center-of-mass, foot contact with ground, swing/stance foot positions relative to center of mass and torso heading. Uno knows the ‘uprightness’ vector between the feet and torso. Deuce can also detect if his feet are crossed, and will attempt to correct by standing on one leg and swinging the front leg outwards. They also have threshold ‘KO velocities’ beyond which they will go limp. This should be from acceleration but this works for now.

These robots run independent copies of their controllers, so they can be copied for rudimentary crowd simulation. Each individual ‘heading goal’ is controlled by a ‘game property’ string in the robot’s ‘torso’ object. The goal can be another object’s name, or use the prefix ‘a:’ followed by an angle (in degrees). Do not change the object names or numeration after copying; the program depends on these.

DEUCE – v.0001a of my ultimate goal of creating a real-time interactive break-dancing/kung-fu-master/soccer-playing/gymnast pose controller. What the heck- let’s throw in piano and chess-playing too. 15 DOF: Torso-XYZ, Hips-XYZ, Knees-X, Ankles-XY. Next version I will probably add 1-DOF toes and change Ankles from Pitch/Roll to Pitch/Yaw. I plan to utilize more of the SIMBICON methods and I am also looking at work on ‘genetic algorithms’ for evolving different controller params.

UNO – Created as a study for balance feedback. Eventually learned to hop, then to flip (sort of). Uno also alters his hip angle to compensate for linear velocity to try to stabilize his momentum after each jump. 9 DOF: Same configuration as Deuce

POGO – First study of proportional derivative controller. Just a collision sensor that triggers an upward force. Uses PD to angle the peg to compensate for linear velocity on the next bounce.

—RIGID BODY CONSTRAINT WALKING MACHINES—
Obviously a ‘dumber’ approach to walking animation, though strangely visually appealing. It started with a goofy thought and ended with hours and hours of trial-and-error learning how the bge handles various constraint configurations. There is a lot more that could be done with this technique. These are just a few setups.

THEO – based on Theo Jansen’s ‘Jansen Mechanism’
GOODWIN – based on W.F. Goodwin’s – ‘Automatic Toy’
NOGOODWIN – based on W.F. Goodwin’s – ‘Horse Toy’.
HEX – a semi-original design derived from observations of several 6-legged robot designs.
LE STECCHINI – based on those ubiquitous wind up walking toys we all know and love.

—OTHER—
MORTIMER/RINGO – Extremely simple self-firing ‘mortar’ objects. Created to test Uno/Deuce reaction to perturbation. Actually these should serve as a warning to anyone who thinks algebra isn’t important. I got tired of trying to learn all the stuff I avoided learning in high school, so I ended up finding the midpoint between the mortar and the target, choosing an arbitrary height, and firing on that vector (there’s a little sketch of this after these notes). Amazingly, it’s a functional targeting system, but it’s an insult to mathematics.

DRAWN-AND-QUARTERED RAGDOLL – Joint constraints had to be created programmatically (as opposed to using Blender UI constraints) so joints could be removed in the simulation. Keyboard events trigger joint ‘breaks’. I could probably figure out a way to calculate limb tension and trigger breaks based on a threshold, but the whole thing was kind of a lark anyway.

TAD – Failed attempt to create a stable unpowered ‘passive walker’ based on Tad McGeer’s work. Has never taken more than three steps. Unpowered passive walkers are pretty sensitive to begin with, and I’m not sure a game engine simulation is enough to create one that works anything like in real life. I’ll give it another go at some point.

H.M.S. STUPID – Another early test. A board that keeps level by using 4 ‘thrusters’. Each fires only if it is below the thruster on the opposite side.
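And since I mentioned it above, the MORTIMER ‘targeting system’ really is just this (a sketch; the arc height is as arbitrary as described):

# Illustrative only: aim at a point some arbitrary height above the midpoint
# between the mortar and the target, and fire along that vector.
# mortar_pos and target_pos are mathutils Vectors (e.g. obj.worldPosition).
def aim_vector(mortar_pos, target_pos, arc_height=10.0):
    mid = (mortar_pos + target_pos) / 2.0
    mid.z += arc_height
    return (mid - mortar_pos).normalized()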

—FILES—
All the models/rigs/programs in this .blend file are free to use for any non-machines-destroying-humanity related purposes as far as I’m concerned. Keep in mind, though, that while this work is original, some of the concepts and designs utilized and referenced are not. The Jansen Mechanism, for instance: I have no idea what kind of intellectual property that is.

If you do use these for anything I’d appreciate a nod.

Word to your matrix.

teldredge

Mar 04 2011
 

I got a little done on the OSC implementation for Cartwheel-3D, but I probably won’t be able to go much further for a while, so I thought I’d post what I had.

In the Cartwheel source there is a file, \Python\App\SNMApp.py, that runs the main animation loop. I edited it to send OSC messages for each joint. To do this I used the pyOSC module and parked it in the same directory for convenience.
The whole thing runs super slow through the py-interpreter. I tried to compile it with py2exe but ran into some trouble. However, I found that I could cheat and use the provided binaries.
Take the modified SNMApp.pyc (and OSC.pyc) and copy them into the \bin\library.zip file in the binaries. It just works.
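The per-joint send is nothing fancy; roughly this, assuming pyOSC’s OSCClient/OSCMessage API (the address pattern and port are made up, not what my modified SNMApp.py actually uses):

from OSC import OSCClient, OSCMessage   # pyOSC (Python 2.x)

client = OSCClient()
client.connect(('127.0.0.1', 9000))

# Illustrative only: send one joint's orientation as a single OSC message.
def send_joint(name, qx, qy, qz, qw):
    msg = OSCMessage('/joint/' + name)
    for v in (qx, qy, qz, qw):
        msg.append(v)
    client.send(msg)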

Here’s my modified SNMApp.py – the pyOSC module can be found here

Blender uses Python 3 so I had to use a different OSC module here. I just parked it in the same directory as blender.exe – also for convenience.

and here is a .blend file with my OSC listener code in it. I got it to work with the bge and the animation system, but not cleanly.

If you have any success with this let me know.