Feedback from students and tutors was unanimous: we need more structures rather than one oversized ‘Jellyfish’, and the sculpture Zoe and I produced should look more like a jellyfish if we want it to be credible.
The interactive sculpture Zoe and I had tested in the art studio was not practical, as it would require high-voltage powered fans to be suspended from the lighting bars of the gallery (PIC UPLOAD). In retrospect, relying on lighting bars is impractical because the work will need to be set up on the Innovation Campus (UOW), packed down, and then set up again on the Main Campus. The sculpture needs to be easy to pack and move.
I am traveling to the Main Campus (UOW) today with the class to view sites where we might install the work.
Went to get a motor. Figuring out how the Arduino hardware, the 5 V stepper motor I just bought, and the Arduino software all communicate is a time-consuming task.
Wrapping my head around the capacitive touch sensor with the Innovation Campus Technical Officer (T.O.), Glenn, was simple enough last time: thanks to the earlier efforts of Deniz Balabay (collaborator on the previous work, Sensory Cocoon) and Glenn, all that was left was adjusting a couple of lines of code and a relatively straightforward hardware setup. This time around seems a bit more complicated. I searched YouTube for tutorials and managed to run the motor, but I cannot make the speed vary.
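After more digging, the speed problem seems to come down to the delay between step pulses: the shorter the wait, the faster the motor turns. A minimal sketch of that arithmetic (plain C++, my own helper name, not the Arduino Stepper library itself; the 2048 steps-per-revolution figure is an assumption for a small geared 5 V stepper like the 28BYJ-48 and should be checked against the motor's datasheet):

```cpp
// Microseconds to wait between step pulses for a target speed.
// One revolution takes 60,000,000 us / rpm; dividing by the steps
// per revolution gives the per-step delay.
long stepIntervalMicros(long stepsPerRev, long rpm) {
    return 60L * 1000L * 1000L / (stepsPerRev * rpm);
}
```

Varying the speed then means recomputing this delay each time through the loop, e.g. from a sensor reading, rather than hard-coding it once.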
After tinkering to no avail, the thought ‘how am I going to trigger this motor in the first place?’ sprang into my head.
The sensors I had thought of using to trigger the motors that pull the Jellyfish up into the ceiling:
- A weight sensor?
- A button trigger?
- On the floor could be a series of ‘knock’ sensors, or conductive weight sensors, acting as pressure pads.
- A motion sensor? It would be too dark to use a light-detecting sensor.
- An ultrasonic sensor could be installed at ‘knee height’ to discreetly detect people as they approach, triggering the stepper motors.
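If the ultrasonic option wins out, the sensor only reports an echo time, and turning that into a distance is a one-line conversion. A sketch of the logic (the 58 µs-per-cm figure is the usual round-trip approximation for HC-SR04-style modules, assuming that is the module used; the 80 cm trigger threshold is a guess to be tuned on site):

```cpp
// Convert an ultrasonic echo pulse (microseconds, round trip) to
// centimetres: sound covers ~1 cm in ~29 us, and the pulse travels
// out and back, so divide by 58.
long echoToCentimetres(long echoMicros) {
    return echoMicros / 58;
}

// True when someone has stepped inside the trigger zone.
bool personDetected(long echoMicros) {
    return echoToCentimetres(echoMicros) < 80;
}
```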
More and Small
Today Mark Richards (a former student) visited and we sat down to talk about this work. His advice was for Zoe and me to scale down our project. It would be a stress relief to go from one massive sculpture to fifteen smaller ones. Mark pointed out that the prototype is due in only two weeks, so it would be best to perfect one small sculpture for the prototype; the other fourteen can wait until later.
Ultrasonic sensors (ULTS)
On re-evaluating, I found that if I put four ULTS at knee height, each sensor would detect only the closest object/person to it and could not track multiple objects/people. There is a way to make ULTS work using more detailed, complex coding and mounting the sensor on a small motor, but this ‘radar’ setup is slow to assess its environment, and if bumped it would need to be re-positioned and reset. Looking back at the previous work, Nothing To See Here (University of Wollongong exhibition), the capacitive touch sensor gave successful results.
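Going back down the capacitive route, the trigger logic essentially boils down to comparing a smoothed reading against a calibrated ‘untouched’ baseline. A rough sketch of that comparison (the struct, baseline, and margin values here are my own placeholders for illustration, not the actual values from the earlier work):

```cpp
// A touch registers when the sensor reading rises a set margin above
// its untouched baseline. Smoothing the reading first keeps electrical
// noise from causing false triggers.
struct TouchSensor {
    long baseline;   // reading with nobody touching, measured at startup
    long margin;     // how far above baseline counts as a touch
    long smoothed;   // running average of recent readings

    void update(long reading) {
        // Simple exponential smoothing: 3/4 old value, 1/4 new reading.
        smoothed = (smoothed * 3 + reading) / 4;
    }
    bool touched() const {
        return smoothed > baseline + margin;
    }
};
```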
The Arduino Mega board should have enough digital pins to operate twelve 5 V stepper motors (4 leads per motor × 12 motors = 48 pins), and I could put a capacitive touch sensor on each ‘Jellyfish’ using the Mega’s analogue inputs. Alternatively, using fewer analogue ports, I could have the motors move at different speeds relative to one another but moving together in groups: one touch-sensored Jellyfish grouped with three non-sensored Jellyfish. So twelve Jellyfish in three groups of four, with one ‘sensored’ Jellyfish in each group.
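The grouping idea above can be sketched as a simple lookup: touching the sensored Jellyfish wakes its group of four motors, each at a different relative speed, while the other groups stay still. A minimal model of that mapping (the speed values and group layout are illustrative, not final):

```cpp
const int GROUPS = 3;
const int PER_GROUP = 4;

// Relative speeds within a group, so the four Jellyfish in a group
// rise together but at slightly different rates (values are a guess).
const int groupRpm[PER_GROUP] = {10, 8, 6, 4};

// Which motor index (0-11) a given slot in a given group maps to,
// with motors laid out group by group.
int motorIndex(int group, int slot) {
    return group * PER_GROUP + slot;
}

// The rpm a given motor should run at when one group's sensor is
// touched: its slot's speed if it belongs to that group, else 0.
int rpmForMotor(int touchedGroup, int motor) {
    int group = motor / PER_GROUP;
    if (group != touchedGroup) return 0;
    return groupRpm[motor % PER_GROUP];
}
```

Keeping the layout in a lookup like this also means re-grouping the Jellyfish later is a data change, not a rewiring of the logic.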
Documentation of experimentation. Excerpt sketches from my art journal: