Report 1 – Technical Research

Technical: find out if it can work and how. Show that it can work in principle. Find the examples that demonstrate what you are trying to do. Specify what steps and equipment are required.

The design my group has chosen is to have participants create music in some form by stepping or jumping on pads. Regardless of what sounds the pads create, whether drum sounds or something else, the technical implementation is unlikely to change drastically, and it may be similar to any of the Musical Stairs projects out there, such as the one in this video: http://www.youtube.com/watch?v=2lXh2n0aPyw.

Ours would likely be even easier to build, because we aren't restricted by the environment the way a staircase is, where the whole stair has to react. If our concept changes so that the environment becomes an issue, we will need to rethink this.

Another example that uses stairs: http://www.youtube.com/watch?v=lb2Sq4oyADI

These examples turn people's steps into sounds, which is mechanically very close to what our concept does.

The equipment required for our installation is as follows:

  • A number of sensors to determine when a pad is being pressed. The closest thing I could find for this purpose is this load sensor: http://littlebirdelectronics.com/products/load-sensor-50kg. Each one senses up to 50 kg of weight, so they can act not only as buttons but can also measure how hard a pad is being pressed.
  • An Arduino board and the various bits included in our kit: the wires, the breadboard and the shift register, so we can read lots of sensors.
  • A small platform: large enough to house the sensors without having to dig into the ground, but small enough that people will walk over it rather than around it.
  • Long wires
  • A computer
  • Speakers
  • A big TV or a projector

Our concept differs from these in that we're also going to have a display showing some sort of visualisation.

The steps to be taken:

  1. Put TV or projector (display) in place.
  2. Connect computer to display.
  3. Put speakers in place.
  4. Connect computer to speakers.
  5. Connect weight sensors to Arduino.
  6. Connect Arduino to computer.
  7. Place pads on weight sensors.
  8. Put platform in place.
  9. Insert pads into platform.
  10. Run software.
  11. Watch people have fun.

The platform will need to be custom made to our needs, whatever they turn out to be. At its most basic it would be a slightly raised wooden board with holes cut out to fit the weight sensors. The pads sitting on those sensors could simply be the pieces of wood cut out of the main board.

With all that in place there are many things we can do with the participants' input. The current concept is to have them collaboratively play a drum kit.
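
To convince ourselves the software side is doable, here is a rough Processing sketch of what step 10 could look like. It assumes four pads, that the Arduino prints one comma-separated line of readings per update, and that the Minim library is used for the drum samples; the sample file names, the pressure threshold and the volume mapping are all placeholder values rather than tested choices.

    import processing.serial.*;
    import ddf.minim.*;

    Minim minim;
    AudioSample[] drums;
    Serial arduino;
    float[] levels = new float[4];          // latest reading per pad (0-1023)
    boolean[] wasPressed = new boolean[4];

    void setup() {
      size(800, 600);
      minim = new Minim(this);
      // placeholder sample files that would live in the sketch's data folder
      drums = new AudioSample[] {
        minim.loadSample("kick.wav"),
        minim.loadSample("snare.wav"),
        minim.loadSample("hihat.wav"),
        minim.loadSample("tom.wav")
      };
      arduino = new Serial(this, Serial.list()[0], 9600);
      arduino.bufferUntil('\n');
    }

    void serialEvent(Serial s) {
      String line = s.readStringUntil('\n');
      if (line == null) return;
      String[] parts = split(trim(line), ',');
      for (int i = 0; i < parts.length && i < levels.length; i++) {
        levels[i] = float(parts[i]);
        boolean pressed = levels[i] > 200;                     // arbitrary threshold
        if (pressed && !wasPressed[i]) {
          drums[i].setGain(map(levels[i], 200, 1023, -20, 0)); // harder step = louder
          drums[i].trigger();
        }
        wasPressed[i] = pressed;
      }
    }

    void draw() {
      background(0);
      fill(255);
      // very rough visualisation: one circle per pad, sized by pressure
      for (int i = 0; i < levels.length; i++) {
        float d = map(levels[i], 0, 1023, 20, 300);
        ellipse((i + 1) * width / 5.0, height / 2.0, d, d);
      }
    }

The draw() part is the bare bones of the visualisation we would put on the TV or projector, which is the bit that sets our concept apart from the stair examples.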

Posted in Human-Computer Experience Design Studio, Uni

Class Discussion

Several ideas were discussed, including evoking emotion in people. The emotion discussed was fear. However, the other five basic human emotions (which were not discussed) are anger, joy, sadness, surprise and disgust. It would be interesting to come up with something that evokes any number of those emotions.

A decent area of The Rocks for evoking fear would be the narrow passageway that's only wide enough for one person.
One technique I thought of to evoke fear is a musical staircase where, instead of music, the stairs creak and groan and eventually sound like they're collapsing.

Of course, regular musical stairs would also be cool, and even though it has been done, it would still be interesting to see what it takes to do it. At its most basic, you could have each key be a separate Arduino that detects when someone steps in front of it and plays a note. This would allow the musical "stairs" to be placed practically anywhere. Additionally, this idea isn't limited to notes: as noted above, you could make the stairs creak, or perhaps play a melody as people walk up (or down) the stairs.
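
If we wanted to prototype the note side on a computer before putting anything on the actual Arduinos, a tiny Processing sketch like the one below would do, with a key press standing in for a footstep. The use of Minim, the pentatonic melody and the idea of cycling through it are all just placeholders; each stair could equally be mapped to a single fixed note.

    import ddf.minim.*;

    Minim minim;
    AudioOutput out;
    // a made-up pentatonic run; each footstep plays the next note
    String[] melody = { "C4", "D4", "E4", "G4", "A4", "C5", "D5", "E5" };
    int nextNote = 0;

    void setup() {
      size(400, 200);
      minim = new Minim(this);
      out = minim.getLineOut();
    }

    void draw() {
      background(0);
      fill(255);
      text("press any key to simulate a footstep", 20, height / 2);
    }

    void keyPressed() {
      // on a real staircase this would fire when a stair's sensor triggers;
      // mapping the stair's index to a fixed note would work the same way
      out.playNote(0, 0.4, melody[nextNote]);
      nextNote = (nextNote + 1) % melody.length;
    }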

Another idea I thought of is to have people wear a device that makes everything sound like it’s underwater, and claim that this is what The Rocks will sound like in 100 years.

Posted in Human-Computer Experience Design Studio, Uni

Human Computer Experience Design Studio

This category is for my blog posts for Human Computer Experience Design Studio.

Posted in Human-Computer Experience Design Studio, Uni

Final Blog Post

I started developing Wars With Friends before the start of the semester. I had the idea of bringing together aspects of Words With Friends and Advance Wars in an online turn-based strategy game. I started out by drawing wireframes of the user interface for the menus, trying to predict what menu screens I would need and how the buttons on each screen would link them all together. Once the wireframes were drawn out I began working on the gameplay, confident that the menus would be easy to implement.

The gameplay began as just an array of Booleans that determined the colour of certain sections of the screen. Then I wrote a class that held the Booleans and an x, y value at which it would draw a rectangle in a draw method. After that it was just a matter of adding more variables to the class that would affect the draw method (a rough sketch of what that first version looked like is below). It was somewhat unsatisfying working with rectangles and colours, so I decided to draw some primitive sprites for each unit and terrain type. This was definitely a good decision, as the game suddenly started to actually look like a game.

The game started looking promising, so I began to implement the menus properly using a library called APWidgets. I didn't strictly need it for the buttons, but it was necessary for the text inputs, because I didn't know how to access the Android keyboard through Processing; I used APWidgets for the buttons too, for aesthetic reasons.

Once the menus were working I started to implement database functionality. The idea was that each user would have a username, password and email, as well as a list of games associated with them, so that when they log in their personal games are displayed in the menu. That basic functionality was easy enough to implement, and once it was working I linked the buttons to the gameplay state and the basics were complete.

It was tough finding a free web hosting service that offered PHP and database access with custom user permissions (which I needed for security reasons), but I eventually found x10hosting.com. I've been using it ever since, though I'm not sure how much longer it will be before they notice I'm breaking the terms of service by having most of the files on the server unrelated to the website itself (I also use the same hosting for my Interaction Design Studio web application). From then on I more or less added features and fixed bugs.
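
For illustration, this is roughly the shape that first rectangle-drawing version took. The class and field names here are reconstructed from memory for the sake of the example, not copied from the actual code.

    // A grid of cells, each knowing its position and a few booleans that
    // decide what colour of rectangle gets drawn.
    class Cell {
      int x, y;                  // pixel position of the rectangle
      boolean isWater, isRoad;   // the booleans that originally picked the colour

      Cell(int x, int y) {
        this.x = x;
        this.y = y;
      }

      void draw() {
        if (isWater)     fill(0, 0, 255);
        else if (isRoad) fill(150);
        else             fill(0, 180, 0);
        rect(x, y, 32, 32);
      }
    }

    Cell[][] board = new Cell[10][10];

    void setup() {
      size(320, 320);
      for (int i = 0; i < 10; i++) {
        for (int j = 0; j < 10; j++) {
          board[i][j] = new Cell(i * 32, j * 32);
        }
      }
      board[3][4].isWater = true;   // a couple of hand-placed tiles for testing
      board[5][5].isRoad = true;
    }

    void draw() {
      for (int i = 0; i < 10; i++) {
        for (int j = 0; j < 10; j++) {
          board[i][j].draw();
        }
      }
    }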

Implementing the artificial intelligence was a real challenge for me. At the start of the course I knew very little about AI; the extent of my experience was writing a shortest-path algorithm for the ghosts in Pac-Man. This time I had to create an AI that would play a strategy game. My initial approach was to hard-code the AI's strategy: I tried to take everything I knew about playing the game and code the AI to behave that way. This was an enormous undertaking, and it took several weeks just to get it playable, though it still wasn't very intelligent. Toward the end of those weeks, I started looking at how Reinforcement Learning could be used to make my AI more intelligent. I struggled to understand how states, actions, rewards and policies could be used to get my AI to learn how to play. I understood the Cat and Mouse demonstration that was given as an example, but taking what I understood from it and using it to make a better AI was no easy task. Every time I thought I had figured out what my states and actions would be, I couldn't figure out how policies would work. The sheer number of states and possible actions kept throwing me off.
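
For anyone else wrestling with it, the textbook tabular version boils down to something like the toy snippet below: states and actions are just integers, the reward is a number, and the policy is "pick the action with the highest Q value, or a random one occasionally". This is nothing like the scale of the real game and is not code from my project.

    // Tiny tabular Q-learning example: states and actions are integers, the
    // policy is epsilon-greedy over a table of Q values.
    int numStates  = 16;    // e.g. the cells of a 4x4 grid world
    int numActions = 4;     // up, down, left, right
    float[][] q = new float[numStates][numActions];
    float alpha = 0.1, gamma = 0.9, epsilon = 0.1;

    int chooseAction(int state) {
      if (random(1) < epsilon) return int(random(numActions));   // explore
      int best = 0;
      for (int a = 1; a < numActions; a++) {
        if (q[state][a] > q[state][best]) best = a;
      }
      return best;                                               // exploit
    }

    void learn(int state, int action, float reward, int nextState) {
      float bestNext = max(q[nextState]);   // value of the best follow-up action
      q[state][action] += alpha * (reward + gamma * bestNext - q[state][action]);
    }

    void setup() {
      // fake interaction loop, just to show the update being applied
      int s = 0;
      for (int step = 0; step < 1000; step++) {
        int a = chooseAction(s);
        int sNext = int(random(numStates));            // placeholder transition
        float r = (sNext == numStates - 1) ? 1 : 0;    // placeholder reward
        learn(s, a, r, sNext);
        s = sNext;
      }
      println("Q[0][0] after 1000 updates: " + q[0][0]);
      exit();
    }
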
I tried implementing a different kind of AI: one with a list of tasks and task doers, with each task having an assigned priority. I got as far as gathering tasks and generating assignments before deciding it would be best to stop trying to make the AI learn and just try to make it good to begin with. So I went back and continued working on the original AI.
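
For what it's worth, this is roughly what I mean by tasks, task doers and priorities, in sketch form; the names are invented for illustration and the real attempt never got much past the assignment step.

    import java.util.ArrayList;

    class Task {
      String description;
      float priority;              // higher = more urgent
      Task(String d, float p) { description = d; priority = p; }
    }

    class TaskDoer {
      Task assigned;               // null while idle
    }

    // Each idle doer grabs the most urgent task left on the list.
    void assignTasks(ArrayList<Task> tasks, ArrayList<TaskDoer> doers) {
      for (TaskDoer doer : doers) {
        if (doer.assigned != null || tasks.isEmpty()) continue;
        Task best = tasks.get(0);
        for (Task t : tasks) {
          if (t.priority > best.priority) best = t;
        }
        doer.assigned = best;
        tasks.remove(best);
      }
    }

    void setup() {
      ArrayList<Task> tasks = new ArrayList<Task>();
      tasks.add(new Task("capture the nearby city", 5));
      tasks.add(new Task("attack the damaged tank", 8));
      ArrayList<TaskDoer> doers = new ArrayList<TaskDoer>();
      doers.add(new TaskDoer());
      assignTasks(tasks, doers);
      println(doers.get(0).assigned.description);   // "attack the damaged tank"
      exit();
    }
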
As much as I would have loved to have a learning AI, I didn’t have the algorithmic or mathematical knowledge to do so, and I didn’t have enough time to learn more mathematics.

Posted in Real Time Multimedia, Uni

Reinforcement Learning

I haven't updated this blog in a couple of months, but I have been making some progress. A month or so ago I made an AI which is decent. As I always do when I code, I first implemented the method by which the new code would interact with the old code; that is, I made the AI output its moves in a format readable by the code that plays back an opponent's moves. I also added a way to select CPU as your opponent. Once I got that working I started on the actual AI. The AI builds, moves, attacks and captures. But it doesn't learn… yet!

It was suggested to me that I read about Reinforcement Learning, so I did. I found this:
http://webdocs.cs.ualberta.ca/~sutton/book/ebook/node27.html
I read through most of it. I'm not sure how much I understood. I really doubt I can implement Reinforcement Learning in time for the project due date, but I'll do what I can, since I'm going to continue working on it in the future. If all else fails, I'll improve the existing AI method until it's at least challenging, and at best unbeatable.

Posted in Real Time Multimedia, Uni

Updates

Thursday I gave a presentation about what my Real Time Multimedia project will be. I’ve decided I’m going to write an AI for Wars With Friends.

Originally I was going to have the AI act almost randomly at the beginning, learning with each move it makes. I was told that a genetic algorithm would be too slow, but I'm not sure. It may be too slow if a player has to play against the AI in order for it to learn, but it could work if the AI used a genetic algorithm while simulating matches against itself.

I'm thinking each unit will have a chance of being built that starts out based on its price, then increases or decreases as that unit type deals or receives damage, captures buildings, and so on (other factors I'll think of later).
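
A rough sketch of the idea, with made-up unit names, starting weights and adjustment amounts:

    import java.util.HashMap;

    HashMap<String, Float> buildWeight = new HashMap<String, Float>();

    void setup() {
      // starting weights based on price: cheaper units are more likely at first
      buildWeight.put("Infantry",  10.0);   // cheap
      buildWeight.put("Artillery",  1.7);   // mid-priced
      buildWeight.put("Tank",       1.4);   // expensive
      for (int i = 0; i < 5; i++) println(chooseUnitToBuild());
    }

    // Weighted random pick over the current weights.
    String chooseUnitToBuild() {
      float total = 0;
      for (float w : buildWeight.values()) total += w;
      float roll = random(total);
      for (String unit : buildWeight.keySet()) {
        roll -= buildWeight.get(unit);
        if (roll <= 0) return unit;
      }
      return "Infantry";   // fallback; shouldn't normally be reached
    }

    // Call this when a unit type does something useful (positive delta)
    // or takes a beating (negative delta).
    void adjustWeight(String unit, float delta) {
      buildWeight.put(unit, max(0.1, buildWeight.get(unit) + delta));
    }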

Posted in Real Time Multimedia, Uni

Thoughts on Particle Systems

I haven't decided yet whether to make Wars With Friends as close to Advance Wars as possible or not. Regardless, I'm going to add weather. Rather than making an animation for the weather, it may be possible to use particle systems instead. On one hand, that will give the weather a more life-like appearance; on the other hand, the realism might not suit the game. I'll try it and see if it works.

I can also use particle systems to make the game's vehicles or factories release puffs of smoke, as well as sparks from weapons and such.
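
Here is a minimal example of the kind of particle system I have in mind, just grey smoke puffs rising from a point; rain or snow would be the same thing with different velocities, sizes and colours.

    ArrayList<Particle> particles = new ArrayList<Particle>();

    class Particle {
      PVector pos, vel;
      float life = 255;

      Particle(float x, float y) {
        pos = new PVector(x, y);
        vel = new PVector(random(-0.5, 0.5), random(-2, -0.5));   // drift upwards
      }

      void update() {
        pos.add(vel);
        life -= 3;        // fade out over time
      }

      void display() {
        noStroke();
        fill(180, life);
        ellipse(pos.x, pos.y, 8, 8);
      }
    }

    void setup() {
      size(400, 300);
    }

    void draw() {
      background(30);
      particles.add(new Particle(width / 2, height - 20));   // the "factory chimney"
      for (int i = particles.size() - 1; i >= 0; i--) {
        Particle p = particles.get(i);
        p.update();
        p.display();
        if (p.life <= 0) particles.remove(i);
      }
    }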


On another note, players of Wars With Friends can now choose a random player to go to war with (in case they have no friends (who play the game)).

Posted in Real Time Multimedia, Uni

Wars With Friends graphics update.

I have updated some of the graphics of Wars With Friends.

The buildings now have a proper image and team colour instead of just looking like squares.

The units also have team colour, but the images are still drawn by me in Paint. I'll update those later.

Also, I've redone the drawing code for units and buildings so they can now be animated. The infantry unit is so far the only unit with animation (there are three different frames of its legs).
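
Driving the frames is simple enough; something along these lines works, with made-up file names standing in for the real sprites.

    PImage[] infantryFrames = new PImage[3];

    void setup() {
      size(200, 200);
      for (int i = 0; i < infantryFrames.length; i++) {
        // e.g. infantry0.png, infantry1.png, infantry2.png in the data folder
        infantryFrames[i] = loadImage("infantry" + i + ".png");
      }
    }

    void draw() {
      background(90, 160, 90);
      // swap to the next frame every 10 draw() calls
      int frame = (frameCount / 10) % infantryFrames.length;
      image(infantryFrames[frame], width / 2 - 16, height / 2 - 16);
      // team colour could be applied with tint() before drawing the image
    }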

Also, fog of war was disabled in the screenshots in the previous posts, so I've enabled it again.

Posted in Real Time Multimedia, Uni

Current state of Wars With Friends.

For those who don't know, I'm writing a turn-based strategy game for Android called Wars With Friends. The gameplay will be similar to Advance Wars, and the Internet aspect similar to Words With Friends.

[Screenshot: Login Screen]

[Screenshot: Select Game screen]

[Screenshot: Bean Island (not sure if copyright breached)]

In these images I was testing loading and unloading of units into a transport. Two of the blue blocks are transports and two of them are the units that came out of them. Some of the blue blocks are water. Graphics update coming soon…

[Screenshots: left and right sides of Bean Island]

Posted in Real Time Multimedia, Uni

Real Time Multimedia

Worms:

Processing Sketch:


Posted in Real Time Multimedia, Uni