Short exercise 1 – Generative 3D-objects
Write a for-loop that generates 100 individual (size, colour) objects in 3D space. Use the functions random() or randomGaussian() and translations.
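The loop structure, independent of the rendering, can be sketched as data only. Below is a minimal Python sketch with illustrative attribute names (size, colour, position); in the actual Processing sketch you would call translate() and draw a primitive inside pushMatrix()/popMatrix() instead of storing dictionaries:

```python
import random

random.seed(42)  # fixed seed only to make this demo reproducible

objects = []
for i in range(100):
    # randomGaussian()-style size: most objects near 20, clamped so none vanish
    size = max(1.0, random.gauss(20, 5))
    # random()-style colour and translation
    colour = (random.randint(0, 255),
              random.randint(0, 255),
              random.randint(0, 255))
    position = (random.uniform(-200, 200),   # x translation
                random.uniform(-200, 200),   # y translation
                random.uniform(-200, 200))   # z translation
    objects.append({"size": size, "colour": colour, "position": position})

print(len(objects))  # 100
```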
Georg Nees, 8-ecke, 1960s.
Create a three-dimensional version of "8-ecke", a classic work of generative graphics by Georg Nees:
- generate 8 points in 3D-space
- connect these points to build a 3D object (boxes, shapes, PShapes, vertices, …)
- generate a matrix (14 × 19) of instances of these generative objects
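The three steps above can be sketched as pure data (a minimal Python sketch; the 14 × 19 grid comes from the exercise, while the cube extent of 50 units and the spacing are assumptions). In Processing, each grid cell would become a pushMatrix()/translate()/beginShape()…endShape()/popMatrix() block:

```python
import random

random.seed(1)  # reproducible demo

def random_points(n=8, extent=50):
    """Generate n random corner points inside a cube of the given half-extent."""
    return [(random.uniform(-extent, extent),
             random.uniform(-extent, extent),
             random.uniform(-extent, extent)) for _ in range(n)]

# 14 x 19 matrix of generative objects, each a fresh set of 8 corner points.
COLS, ROWS, SPACING = 14, 19, 120
grid = []
for col in range(COLS):
    for row in range(ROWS):
        # the cell offset corresponds to translate(col * SPACING, row * SPACING)
        grid.append({"cell": (col, row), "points": random_points()})

print(len(grid))  # 266
```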
Exercise 2 – Augmented Reality
Take your generative object from the last lesson's exercise and attach it to a marker to take it from virtual to augmented reality.
Optional (1): Can you write a class for your object that lets you attach generative instances to several markers?
Optional (2): Can you write a sketch that attaches the objects of all students of the course to different markers to curate an augmented reality exhibition?
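One possible structure for Optional (1), sketched in Python: keep the generative object separate from the marker bookkeeping, and derive each object from a seed so that the same marker always reproduces the same shape. In Processing with an AR library (e.g. NyAR4psg) you would draw the looked-up object inside the detected marker's transformation; the class and method names here are hypothetical:

```python
import random

class GenerativeObject:
    """One generative object, reproducible from a seed (hypothetical design:
    seeding makes every marker always show the same shape)."""
    def __init__(self, seed, n_points=8):
        rng = random.Random(seed)
        self.points = [(rng.uniform(-50, 50),
                        rng.uniform(-50, 50),
                        rng.uniform(-50, 50)) for _ in range(n_points)]

class Exhibition:
    """Maps marker ids to objects; the AR library supplies the marker ids
    and transformation matrices at runtime."""
    def __init__(self):
        self.by_marker = {}
    def attach(self, marker_id, obj):
        self.by_marker[marker_id] = obj
    def lookup(self, marker_id):
        return self.by_marker.get(marker_id)

ex = Exhibition()
for marker_id in range(4):
    ex.attach(marker_id, GenerativeObject(seed=marker_id))

# The same seed always reproduces the same object:
assert ex.lookup(2).points == GenerativeObject(seed=2).points
```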
Short exercise 3 – Android-Devices
(1) Write a sketch that lets you paint structures on your display using the gestures "double tap", "pinch" and "rotate".
(2) Extend your drawing sketch with a function that saves your artwork as an image file on your phone.
Short exercise 3.1 – Android-Devices
Code your own visual effect for your mobile camera. Make parameters of its visual output (e.g. radius of circles, ...) controllable by gestures.
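The core of such an effect is a linear re-mapping from a gesture value to a visual parameter, which is exactly what Processing's map() function does. A minimal Python sketch (the pinch range 0.5–3.0 and the radius range 10–100 are assumptions):

```python
def remap(value, in_min, in_max, out_min, out_max):
    """Linear re-mapping, like Processing's map() function."""
    t = (value - in_min) / (in_max - in_min)
    return out_min + t * (out_max - out_min)

# Hypothetical mapping: pinch scale factor (0.5 .. 3.0) controls circle radius.
radius = remap(1.75, 0.5, 3.0, 10, 100)
print(radius)  # 55.0
```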
Exercise 2 – Google Cardboard
Create a generative three-dimensional virtual reality environment for Cardboard. If you don't have an Android device at hand, just do a simulation on desktop; we'll test it in the lecture (keep performance in mind).
Please read: JSON-Format
Lecture 3 / 4
Short exercise 3-1
Go to the website https://www.data.gv.at and find out from which URL data can be retrieved. Display the feed in Firefox and analyze it. Can you also load it with Processing?
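In Processing the feed can be loaded with loadJSONObject(url); the analysis step is about discovering the keys and the nesting depth. A Python sketch on an inline sample (the structure below is illustrative, not the real data.gv.at schema; check that in the browser first):

```python
import json

# Illustrative catalogue-style response, stand-in for the real feed.
feed = json.loads("""
{
  "result": {
    "count": 2,
    "results": [
      {"title": "Dataset A", "format": "CSV"},
      {"title": "Dataset B", "format": "JSON"}
    ]
  }
}
""")

# Analysis step: which keys exist, and how deep is the nesting?
print(list(feed["result"].keys()))  # ['count', 'results']
for entry in feed["result"]["results"]:
    print(entry["title"], entry["format"])
```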
Short exercise 3-2
Come up with your own mapping for Bitcoin data (like price and amount) and generate 3D structures from it.
You may also include random processes.
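A minimal mapping sketch in Python (the sample ticks and the output ranges are invented for illustration): price controls box height, traded amount controls width and depth.

```python
def remap(value, in_min, in_max, out_min, out_max):
    """Linear re-mapping, like Processing's map() function."""
    t = (value - in_min) / (in_max - in_min)
    return out_min + t * (out_max - out_min)

# Hypothetical sample data: (price in EUR, traded amount) per day.
ticks = [(2400.0, 12.5), (2550.0, 8.0), (2300.0, 20.0)]

prices = [p for p, _ in ticks]
amounts = [a for _, a in ticks]

# One 3D box per tick: price -> height, amount -> width and depth.
boxes = []
for price, amount in ticks:
    height = remap(price, min(prices), max(prices), 10, 200)
    width = remap(amount, min(amounts), max(amounts), 5, 50)
    boxes.append((width, height, width))

print(boxes[0])  # (width, height, depth) of the first box
```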
Short exercise 3-3
Get your own API-Key for the NY Times API.
Take the code from above and search for a term (you can choose it yourself).
Visualize the number of hits and the publishing date (e.g. per year) for your term from the JSON response. Remember the difference between JSONObject and JSONArray.
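A Python sketch of the counting step, using a simplified stand-in for the Article Search response (the real response nests more levels; verify field names against the NY Times API docs). The outer value is an object (JSONObject in Processing), while docs is an array (JSONArray):

```python
import json
from collections import Counter

# Simplified, illustrative stand-in for an Article Search response.
response = json.loads("""
{
  "response": {
    "docs": [
      {"pub_date": "2015-03-01"},
      {"pub_date": "2015-11-20"},
      {"pub_date": "2016-07-04"}
    ]
  }
}
""")

# "response" is an object; "docs" is an array we can iterate over.
docs = response["response"]["docs"]
hits_per_year = Counter(doc["pub_date"][:4] for doc in docs)
print(sorted(hits_per_year.items()))  # [('2015', 2), ('2016', 1)]
```

Each (year, count) pair could then drive one bar of a bar chart.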
Choose an online source for JSON datasets (like the ones in the lecture / NY Times, Bitcoin), load the data and design the mapping of the data to visual parameters. Generate visuals or sonifications from the data. Use the steps we discussed in the course to come up with an idea: (1) data (retrieval), (2) algorithm, (3) parameter mapping, (4) sensory output.
Build a simple patch in MAX and send data from Processing to MAX to make it sound.
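A common bridge between Processing and MAX is OSC over UDP: in Processing the oscP5 library builds the messages, and in MAX a [udpreceive] object picks them up. The wire format itself is simple; here is a Python sketch that builds a minimal OSC message with one float argument (the address /price and port 8000 are assumptions):

```python
import socket
import struct

def osc_pad(b):
    """Null-terminate and pad a byte string to a multiple of 4 (OSC alignment)."""
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address, value):
    """Build a minimal OSC message: padded address, type tag ',f', big-endian float."""
    return osc_pad(address.encode()) + osc_pad(b",f") + struct.pack(">f", value)

msg = osc_message("/price", 2400.0)
print(len(msg))  # 16 bytes: 8 (address) + 4 (type tag) + 4 (float)

# Sending to a patch listening on port 8000 (commented out for this demo):
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(msg, ("127.0.0.1", 8000))
```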
Use the skills you learned in the course to develop an interactive (maybe using controllers or sensors) generative installation that produces visuals and/or sound (with Processing or MAX).
22. 06. 2017 (Lecture): Briefing, Coaching
07. 07. 2017: Send concept and prototype made in Processing. --> Feedback
14. 07. 2017: Deadline. Send your final work and exercises 1-4 to andreas.pirchner ufg.at