A second visit to the DMC by Sonny, David and me served as preparation for our last meeting together, since we were all available and local. It also meant we wouldn’t be rushed next week (Monday, the day before we go live).
On this visit we decided to rig up the finished window and soften the edges of the frame so that it looked more authentic, using tape to box a square around the lens of the projector to create the effect. We also set up the projectors we thought we’d be using, including the idea of changing the table setting and the way the spiral faces, so that the projection could be shone onto a piece of paper on the ground. This was done with a Quime projector under a table, with a hole through some sheets fitted to the frame of the page, and run on a loop.
We also decided to cut out individual words, as suggested last week, and integrate them as part of the central page display: as the words rise, less of the poem Sonny extracted is displayed, until the only remaining word is “hell”.
The poem is as follows:
Last night i wandered into hell
but i did not find one evil person
just sad human beings searching
desperately for a way out of the dark
This week we basically needed to showcase something, and to backtrack to a piece that worked in previous weeks rather than get caught up on one idea. The idea we believed in was the deletion of the sheets, but the feedback suggested it wasn’t strong enough unless it was being used as a medium. I decided to try a different approach, this time a literal one, to try and decipher the actual motivations and story behind the work. The feedback sent us back to thinking about the subjective view of the space, and to the realisation that we needed to inform the responder, rather than just setting out the space and asking them to come up with their own meaning.
I grabbed a pen, some paper and some objects and started jamming in a diorama setting. I had the idea that we could literally play two projectors across the space at one another: one an exploration of the treatment of refugees, and one an exploration of what can happen when refugees assimilate. The treatment video would be attached to the ceiling, whilst the assimilation video would play from down below; when the audience entered the space, they would interrupt the assimilation video, blocking it. Whilst that video wouldn’t be visible, the treatment video, or “the past”, would continue to play, as we cannot change the past. The idea is that both the past of war and the current status of those affected can’t be re-written in history (this incorporates the letters) but have to be acknowledged moving forward. The content then doesn’t have to be original; it would be a remix of all the different implications of conflict across nations and cultures.
Another element suggested to us was to split into sub-groups within our formed group and come together with works that interacted with one another. We took our own approach to this with the new creation, assigning members a “good”, a “bad” and an audio role, whereby no one could see the others’ works or hear the overlaying audio; we’d basically jam it all together when everyone was done. This meant the content wouldn’t have a structure, nor would the audio match the visuals. We didn’t know how this would pan out, but we needed to take a risk this week.
For my part, I played around with the audio of various film trailers, songs and speeches that all carried connotations of loss, loneliness and the effects of war, and mashed it all together. This was to accompany the montage of visuals by the other groups, though I had no idea what their creations would look like. The plan was then to stage it all in the space and see if it worked.
The feedback we received was that the videos could also include information about the family that lived there, and that there should be longer pauses between the clips so that the televisions appeared to flick on and off. We agreed with this and would like to build on it next week.
BE YOUR OWN CAMERA CREW
As a media and communications student in my final year of university, I’m interested in the ways collaboration between industry, emerging technology and myself as a creative can co-exist, based on more than just one skillset. I started in Digital Media because I wanted to innovate in the filmmaking space: to create a start-to-end video project using only drones as the main camera rig, in all sizes and methods, not only for aerial bird’s-eye views but for more intimate shots at eye level. Over the degree we’ve leveraged a blog, sometimes a YouTube channel, and written essays on topics that often get left in a subject dropbox and forgotten about. I believe new technologies such as autonomous robotics are going to help creative content producers like myself gain an exciting edge over the “creating” gap between industry and graduates.
A digital portfolio was my answer to the question of how individuals can “be (their) own camera crew”, and of how a series of previous projects, and indeed subject-specific works, can be organised to showcase to industry in filmmaking and creative spaces the experience of making start-to-end projects. I wanted to incorporate autonomous film devices such as my drone, so that I may be not only the filmmaker; the technology now enables us to be the subject too. YouTube, and consequently video as a platform, allowed me to address the creative accumulation of content for a portfolio, as well as showcase the potential these devices have to make professional-standard works. YouTube was chosen so I could continue producing works on an openly featured platform that invites stakeholders as well as constructive comments to improve elements of a work. Using this type of device highlights my interest area as well as the space left by filmmaking theory, thus: “Though there is a range of techniques to automatically control drones for a variety of applications, none have considered the problem of producing cinematographic camera motion in real-time for shooting purposes” (Galvane et al., 2016).
The filming process included a shot list I’ve created below, all of which can be done by myself, whether riding the board or not, as the drone can be programmed to actively track a subject. This is done by clicking and dragging a box on the live video feed on a smartphone whilst the drone is in the air. This further amplifies the potential of creating a work that I believe could be taken to a major production company to showcase a skillset.
Following talks with my tutor, I’ve presented this as a three-part video series titled “Be YOUR Own Camera Crew” on YouTube. This allows an accumulation of different angles on what can be showcased with a single recording device, an internet connection and a great idea. I also wanted to lay to rest some of the anxieties about these emerging technologies, by sparking discussion and addressing a tool that’s being heavily regulated at the moment. “Despite all of the new tools, drones are still only used in about 10 percent of film productions where a camera drone and crew can cost less than $3,000 compared to $25,000 for a helicopter shoot.” (Marken, 2017) I believe this will change the way not only filmmaking is approached, but also journalism and surveillance.
A Dalton, 2016, This Sci-fi Film was Shot Entirely by Autonomous Drones, Engadget, blog post, viewed 1st June 2017, <https://www.engadget.com/2016/09/27/in-the-robot-skies-sci-fi-film-shot-autonomous-drones/>
A Marken, 2017, Visible Flight: Drones Raise Filmmaking Opportunities, Robotics Tomorrow, webpage, viewed 1st June 2017, <http://www.roboticstomorrow.com/article/2017/05/visible-flight-drones-raise-filmmaking-opportunities/10110>
C Moore, 2016, Cybercultures Week Two 2016 (w.2), Prezi lecture, DIGC335, University of Wollongong, 8th March 2016, viewed 28th May 2017, via <https://prezi.com/poqmln3hslyh/cyberculture-and-cybernetics>
Civil Aviation Safety Authority, 2017, Remotely Piloted Aircraft Systems: Can I Fly Here?, Australian Government, viewed 30th May 2017, <https://casa.dronecomplier.com/external>
C Rollins, 2017, Mavic Pro – Active Track on a Boosted Board, May 26th, YouTube, online video, viewed 30th May 2017, <https://www.youtube.com/watch?v=1mXo6yz4cv4>
L Young, 2016, In the Robot Skies, Vimeo, online video, viewed 1st June 2017, <https://vimeo.com/184429206>
Q Galvane, J Fleureau, F.L. Tariolle & P. Guillotel, 2016, Automated Cinematography with Unmanned Aerial Vehicles, WICED ’16 Proceedings of the Eurographics Workshop on Intelligent Cinematography and Editing, Portugal, May 9th, pp. 23–30, Eurographics Association, Switzerland, viewed 29th May 2017, <http://dl.acm.org/citation.cfm?id=3056987&preflayout=tabs>
Robotic implications and emerging technologies have always fascinated me, and my university career has led me to dive into their potential, usually by obtaining a new device or creating an account on a platform and just using it. Over the years at the University of Wollongong, I’ve created digital artefacts and media art, and written countless blog posts, all of which have contributed to a creative content portfolio that will serve as a résumé moving into the digital generation, led by us in communications and digital media. This aggregated content curation has led me to drone technology, an emerging technology that this degree, and certainly its graduates, will have to become aware of as a tool for capturing aerial images and video. I’ve been focused on these devices for about two years now, ranging across theoretical research and skillsets in ethical privacy, e-waste, production and consumption in the Asia-Pacific, commercial and non-commercial use, agricultural implications, and aesthetic filming and editing, as well as launching a start-up within iAccelerate, fuelled by UOW Pitch 2016, whereby I’ve designed, built and implemented a drone device in collaboration with the RMS.
What I intend to do with this research is something down the aesthetic road, whilst still defining my expertise and encouraging others to do the same. I want to understand and field-test the options that “off-the-shelf” commercial drones offer filmmakers and creatives, essentially out of the box and ready to fly. Collision avoidance, active tracking and smart landing are all features drone manufacturers must have at the core of their product if they’re to compete in the market today. These devices are a flying personal camera crew, and I want to create a video in which I am the subject, filmed by me. The edit is then done by me too, so my skills in two fields are being tested and improved at once. The act of creating and learning from failure now has more showcase potential than ever before: we have an evolutionary showreel to demonstrate what we’d describe in a written résumé as “flexible, diverse and hard-working”. Inevitably, this content creation goes back into my portfolio online, and hopefully, as media professionals scroll through the years, areas of improvement are evident.
For this digital artefact, titled “Be Your Own Camera Crew”, I want to create one video, or a series of them, capturing the skills I’ve learnt over my years of university attendance, and to create a visual showcase of this device’s potential for content creators like us, who are trying to create jobs that perhaps aren’t even open for applications yet. The processes of editing, planning shots and ultimately creating a start-to-finish product that can then be used in a workplace portfolio are something I’d be interested in exploring. The drone I will be using is a product made by DJI, one of the world’s leading manufacturers of quadcopter technology, and utilises what’s called “ActiveTrack”. This feature lets the user click and drag a virtual box around a subject in the drone’s live view on a smartphone, which then becomes the focus point of the shot. Once the subject is recognised and focused on, piloting the drone becomes automatic, without the controller or even any piloting gestures. This means the person creating the work doesn’t have to worry about missing a particular moment in the filming process, as the drone stays fixed on them. The type of video I would like to create would be something visually aesthetic as well as something with motion. The editing is then done with free software that comes with personal laptops (in my case iMovie) or is available at the university. Lastly, the platform, YouTube, makes this publicly available to anyone with an internet connection. This allows critical feedback, as well as the convenience of a link to embed in future career talks and applications.
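DJI’s ActiveTrack is closed-source, so the exact algorithm isn’t public, but the core “drag a box, follow the subject” idea can be sketched as a naive template-matching tracker. Everything below (the `track` function, the search window size, the toy frames) is my own illustrative assumption, not DJI’s actual implementation:

```python
import numpy as np

def track(prev_frame, box, next_frame, search=10):
    """Naive template tracker: find where the boxed patch moved to.
    box = (x, y, w, h); searches within +/- `search` pixels.
    NOTE: a conceptual sketch only, not DJI's algorithm."""
    x, y, w, h = box
    template = prev_frame[y:y+h, x:x+w]   # pixels the user "dragged a box" around
    H, W = next_frame.shape
    best, best_pos = None, (x, y)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            nx, ny = x + dx, y + dy
            if 0 <= nx and nx + w <= W and 0 <= ny and ny + h <= H:
                patch = next_frame[ny:ny+h, nx:nx+w]
                ssd = np.sum((patch - template) ** 2)  # sum of squared differences
                if best is None or ssd < best:
                    best, best_pos = ssd, (nx, ny)
    return (*best_pos, w, h)

# Demo: a bright 8x8 "subject" moves 3px right and 2px down between frames.
f1 = np.zeros((60, 60)); f1[20:28, 20:28] = 1.0
f2 = np.zeros((60, 60)); f2[22:30, 23:31] = 1.0
print(track(f1, (20, 20, 8, 8), f2))  # -> (23, 22, 8, 8)
```

A real drone would run something far more robust (handling scale, rotation and occlusion) and feed the box position into its flight controller, but the loop of “re-find the subject each frame, then move the camera to keep it centred” is the same.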
I will be tangling with the Week 5 topic of “The Object” and the autonomous functions found on commercial quadcopters today. With some research, I’ve discovered I’m not alone in thinking that drones could hold a real-world place in providing accessibility to social, educational and creative disciplines; as theorist Utkarsh Mittal conveniently states, drones will provide new opportunities for content creation and research, and users may come to expect drones to be part of the technology resources available. I use this example broadly, but one specific example I’ve been exposed to is the library at the University of Wollongong. Its “maker-space” will encourage emerging technologies, with new devices on offer for students completing projects that require an extra layer of physicality or aesthetic. These include drones, 3D printers and virtual reality, all available within a university library, which means the integration of these devices is already being encouraged as an academic resource.
In my project, I want to contribute to this discussion from the viewpoint of a creative content creator. Unmanned aerial vehicles (UAVs) are linked to all kinds of prejudice and harsh criticism about privacy, hostility and the unknown realm that is autonomous robotics. My intention is to explore the last of these categories, and to unlock what it means for aspiring media professionals like myself when the devices we use to conduct research and perform our creative tasks are suddenly able to do so without piloting and without supervision.
“Examines the place and impact of new digital manufacturing technologies – 3D printing in its various forms, CNC machining, Laser cutting, and digital knitting and weaving”
Upon visiting the Powerhouse Museum and the showcase that was “Materialising the Digital”, perhaps my greatest observation was the value of the iteration process in producing a physical artwork, installation or interactive technology, even where the purpose of these manufacturing technologies is solely aesthetic. As we know, some of these practices don’t yet have the traction needed to become everyday devices like our smartphones or laptops; however, 3D-printed materials that emulate our surroundings and bring natural occurrences to life are, I believe, what Matthew Gardiner has captured perfectly with his artwork Oribotics: The Future Unfolds.
Through the entrance of the gallery space at the Powerhouse Museum, toward “Out of Hand”, Gardiner’s work sits in the first room with its luminous pastel greeting of what initially looks like flickering LED bulbs. Interaction is encouraged but touch is disallowed, so naturally the audience, myself included, wanted to get as close as possible to the works to see which had the ability to change, or which we could manipulate to accommodate an individual experience. “Oribotics” held my full attention throughout the exhibit for this personal reason: every time I interacted with the work, it responded to my movements, my motivations and my curiosity towards it. On first viewing and interaction, the robotics involved in the processes of opening and closing had a blossoming effect. The origami design helped this aesthetic and drew connotations to cultural representations of Japanese flora and art. As I drew closer and realised the movement of the work was proximal, I began judging the implications of standing closer, and progressed to using my hands to make them expand and retract. This was all before researching the works; these were purely my initial thoughts and curiosities. The display ran across a rippling wall, each oribot displaying a different coloured light behind it depending on how close any interference came to its sensory trigger. I worked out that the closer the subject was to the device, the warmer the colour (red, orange), and the further one pulled away, the cooler (blue, green). The materials came across to me as a web-like surround around a series of wires, central to a mechanism that expands and retracts. My interactions with the device were captured at the time, and can further explain this idea, thus:
Throughout the exhibit the devices embraced a self-directed approach whenever there was no interaction. This created an eerie atmosphere in the room, due to the colourful contrast. I wanted to know why some of my movements, some being very similar, caused different levels of intensity in how the objects changed. Some of the devices had a more sensitive response to my hand than others, and some didn’t react to my hand at all but would then act autonomously without being provoked by an outside factor.
Matthew Gardiner is an interesting practitioner in the way that he embodies the values upheld by the university’s digital media teaching, and the way we should approach some of the projects we’re asked to create in a short amount of time. Instead of thinking about how to use a device or technology to create something aesthetic, he grabs an idea or process that is already complicated and not associated with media art and technology, and uses his own skill-set and research practice to represent it with a technology: reverse-engineering something that is perhaps static in its practice but can take another, simpler form. This really resonated with my curiosity and my own creative endeavours; with a lot of the technologies I’m personally involved with, I’ve had to reverse-engineer a practice for them within a creative space. Gardiner is a cross-cultural interactor, meaning he regards his works as influenced by Japanese tradition (origami = oribotics, blossoming plants), Western ideas of technology (3D printing, sensory technology, LED lights), the repeated conversations between the two, and the response of the audience. He is also an accomplished designer, working with material science to perfect a fabric, and with the experimentation and exploration of ready new technology and computer science, so that the two can create a work (M Gardiner 2010, Vimeo). His background in a digitally dominated field allows him to exercise these kinds of aesthetics within a work while exploring deeper meanings associated with biological factors. This immediately got me interested in the way this artist thinks about the world, and in how process would have been an interesting exploration for him when designing this particular piece.
Image: Jayne Ion, Facebook 2016
The work debuted at the Ars Electronica Festival in 2010, and introduced me to an interesting locational narrative about how the work came to life; it dives deeper into the meaning behind the materials and how the layers of research ultimately create such a successful work. The installation is situated in the FutureLab, directly above the BioLab and the FabLab. These two labs interact in the same way he considers himself cross-disciplinary.
The BioLab is a space where visitors to the centre are shown methods by which plants are synthetically cloned; it’s also symbolically situated in close proximity to the 3D printer that was used for the materials in the oribotics. It’s interested in the ways biology interacts, the processes connected to life, and how artists in residence can represent this in their works, offering the patterns within a lifeform. The FabLab, located opposite, looks at the ways we can manipulate materials and use new technologies. Things like laser cutters and, for this work, 3D printers are focused on not for their aesthetic presence but for their ability to produce the materials within a work: the focus isn’t the process of the print, it’s the ability to tinker with the materials of the work produced. The plastic used for Oribotics: The Future Unfolds was made in the FabLab, as were the corresponding folds, manipulations and designs for the devices. The constant dialogue from the origins of this work includes the symbolic representation of the microscopic folds in the material used, which “highlights the connection to the many contexts where folding occurs in nature, the most significant being the folding of proteins, including DNA” (Ars Electronica, 2010). Where these actions occur, in nature and in the work, we can understand that even the slightest of mistakes could have rippling effects on the subject; it’s why Matthew Gardiner has given such in-depth thought to the materials he’s chosen, through understanding and trial and error.
His account of the materials for the oribotics is best explained by him: he talks about how paper fibres were unsuccessful, because the material’s fibres broke when folded repeatedly. At 900x magnification it was revealed that this would affect the structural memory and integrity of the device. What his research discovered was that plastic polyester, produced in the FabLab as mentioned, has fibres that, even though they don’t break, still allow the material to bend, letting a fold be remembered within the device. Polyester was therefore deemed the stronger material for the origami shapes, as their entire aesthetic lies in the way they seamlessly fall back into shape every time. The artwork could then be produced over multiple iterations, with greater durability, while keeping the structural memory that could be programmed to perform the blossoming aesthetic.
“discovering patterns that have complex expressions that can be repeatedly actuated” (Ars Electronica, 2010)
Image: Jayne Ion, Facebook 2016
This process was a personal fascination for me: the lifeless, meaningless and somewhat inanimate processes of 3D printing, micro-fabrics and robotics being digitalised to assume life, or to resonate with representations of life and purpose. It was an area of practice we had tried to discuss during a workshopping exercise in a gallery space at the Digital Media Centre, brainstorming potential project explorations, and subconsciously this work resonated with me for that reason. I’m interested in the way Matthew has managed to duplicate a natural occurrence such as DNA protein folding, and the precision involved, yet somehow related it back to a field of practice that, if initially spoken about in the same sentence, wouldn’t seem to have much alignment. The idea of cross-disciplinary skills and knowledge creating a media artwork that allows interaction is something I believe amplifies this work.
The interaction process was something I tangled with, and though it wasn’t perfected in some areas of the space when I visited, the overarching theme was clear. The proximity sensors awaited human presence, back-lit with an LED light, so that the research and groundwork above could be showcased as an artwork. The closer the subject got to the oribotics, the more the robot would expand, like a blossoming flower, and the colour would correspond to the movements. This simple interaction had yet another layer of complexity, with the folds in the movements of the oribotics reaching 1050 in a single contraction. In these movements, the oribotics would assume an autonomous trigger point, creating a ripple effect of opening and closing across the wall of interactions.
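The proximity-to-colour-and-expansion behaviour described above can be captured in a few lines. This is purely a toy model of what I observed as a visitor; the distance range, colour bands and the `oribot_state` function are my own illustrative assumptions, not Gardiner’s actual control code:

```python
# Toy model of the interaction: the closer a visitor is to an oribot,
# the warmer its LED colour and the wider its fold opens.
# Thresholds and the 150 cm range are assumed values for illustration.

def oribot_state(distance_cm, max_range_cm=150):
    """Map a sensed proximity to an (openness, colour) pair.
    openness: 0.0 = fully closed, 1.0 = fully blossomed."""
    d = min(max(distance_cm, 0), max_range_cm)   # clamp to sensor range
    openness = 1.0 - d / max_range_cm
    if openness > 0.75:
        colour = "red"       # right up close: warmest
    elif openness > 0.5:
        colour = "orange"
    elif openness > 0.25:
        colour = "green"
    else:
        colour = "blue"      # far away: coolest
    return round(openness, 2), colour

print(oribot_state(10))   # close up  -> (0.93, 'red')
print(oribot_state(140))  # far away  -> (0.07, 'blue')
```

The ripple effect along the wall would then just be each oribot feeding a delayed copy of its trigger to its neighbours, but the per-device mapping is the heart of the interaction.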
It’s interesting to see this work as technology reflective of life on earth, and of processes that happen in great detail and often go unnoticed. Perhaps this work explores a possible future of where life on earth is headed: through robotic technology we are almost able to model a DNA system with artificially created materials, layered in space. I particularly like the potential these oribots have to steer the narrative away from robotics as a general, socially constructed moral panic. These small flower-like devices have a living-organism feel to them, and could pave the way for how we think of robotics as part of our everyday life. Moving away from the stereotype of “robots taking over the world”, these oribotics are dependent organisms; for those who don’t like the idea of rushing into autonomy, they have some basic autonomous controls but are programmed by us. Instead of creating robots modelled on the human species and making them look as close to us as possible, Matthew Gardiner has developed the flora equivalent. I believe this could invite a positive attitude from those who believe artificial intelligence has negative impacts on society. If we created robotics with the attention to detail Gardiner showcases in “The Future Unfolds”, perhaps plant life that has been irreversibly damaged could, instead of remaining bare, be replaced with these blossoming devices. Deforestation effects might be countered by larger-scale oribots that echo the nature of the affected plantation: tall blossoming trees that, while they don’t offer natural purification, would be aesthetically more appealing.
M Gardiner, 2010, Oribotics [the future unfolds], Vimeo, online video, November 2nd, viewed 26th April 2017, <https://vimeo.com/16429167>
M Gardiner, Oribotics, Oribotics.net, viewed 27th April 2017, <http://www.oribotics.net/>
Ars Electronica, 2010, Artist in Residence: Matthew Gardiner – The Future Unfolds, Repair, 2.9. – 7.9., viewed 27th April 2017, <https://www.aec.at/repair/2010/07/15/artist-in-residence-matthew-gardiner-the-future-unfolds/>
Museum of Applied Arts and Sciences, 2017, Out of Hand: Materialising the Digital, MAAS, viewed 28th April 2017, <https://maas.museum/event/out-of-hand-materialising-the-digital/>
M Gardiner, 2012, The Functional Aesthetic of Folding, Self Similar Interactions, ResearchGate, viewed 30th April 2017, <https://www.researchgate.net/figure/221308482_fig1_Figure-1-Oribotics-the-future-unfolds-installation-in-Melbourne-Australia-2010>
When thinking of a project we’re always looking for the perfect “aesthetic”, and I, like many, find this the hardest part of finding the intrinsic motivation to see a project out. This semester we’ve been challenged to reverse-engineer our own thinking and take into account where our practice could take us. “What is my field?” we were tasked to question, and as a continuation we have been preparing a project pitch as if we were to create a work based on our expertise, skill-set and research throughout our careers. I’ve always had a passion for creating projects, always looking to carry curiosity over into further classes and to try my luck at it as a career; this task is no different.
Inanimate to life
This is something that resonates really closely with me and something I have a very particular passion for. I am really interested in the way humans respond and act towards inanimate objects, and more specifically technology that emerges through rapid prototyping, seeing the results through trial and error. The idea of Inanimate to Life struck me during a workshopping exercise at the Innovation Campus. I want to recreate the aesthetic that completely blew me away on the excursion to the Powerhouse Museum: the timeline-style set-up of old Mac and Apple products and prototypes. Without realising it, I’ve accumulated a lot of old tech that has either broken or been superseded by a newer device, ranging from a phone, tablet and laptop to video cameras, drones and small, simple robotics. The physical arrangement of these devices, showcasing what little use they have left, would I believe create a dialogue between the works they’re capable of producing.
I would like to have a series of working video cameras videoing the cameras that no longer work. I’d love to have a working drone carrying another via a piece of string, and another tied to a wall that just hovers, becoming the part of the installation that doesn’t move, not through the autonomy of the device, GPS assistance or even pilot control, but through its constraint to a piece of string. Perhaps these devices serve a different purpose now, and it’s how they work in sync with one another that creates the aesthetic. I have a vision of a confetti-crazy space designated to this technology on the last limbs of its life, where a battery running out or a malfunction would resonate with the sacrifices made to connect to the user and capture particular content.
There is also an interesting tangent in how, when left alone with full autonomy, these devices interact with one another. This project proposal is inspired by the work of Cirque du Soleil, ETH Zurich and Verity Studios, which showcases a device gaining a life-like quality of curiosity and automation through movement and sound, creating a visual aesthetic we don’t usually associate with the technology of quadcopters:
“a short film featuring 10 quadcopters in a flying dance performance. The collaboration resulted in a unique, interactive choreography where humans and drones move in sync. Precise computer control allows for a large performance and movement vocabulary of the quadcopters”
I’d like to explore the life-like qualities we give these devices, and how they interact with other devices, so that when responders and audiences step into the installation their interactive expectations are shut down; I want them to step into a room of crazy interactions and dialogues between the technologies. The power in the way they work, don’t work, or assist one another is the experimental artwork I hope to achieve, and something with real visual interest. Devices that are slowly beginning to stop working, or have broken already and serve as a kind of graveyard or shell prop, mixed with those that are fully functional and aiding the descent of the others into the inevitable, could I believe be an interesting exploration of what we expect from, and the pressures we exert onto, the devices we consume regularly in media arts.
If you were to ask me what my plans are for work after university, or what kind of role I’ll play at a particular company or organisation until I get my big break, I’d hate to come off as rude or naive; I’d trade those words for ambitious and unknown. A recurring theme in my direction is the unknown. This isn’t due to a lack of research or practice, or me throwing in the towel with no real idea of what I want to do; it’s the fact that what I want to do, or aspire to do, doesn’t exist yet. For this particular mindset I can thank those I consider mentors and friends in Ted Mitew and Chris Moore from the University of Wollongong. That, and my introduction to Casey Neistat, where the idea of an idea is now dead: unless you jump into something you love and just “do it”, or make something that shows you’re keen, it will never grow into anything, and perhaps you’ll be stuck looking for a set of instructions on how to make it with the training we get in media arts/digital media.
To give some context, I’ve been working on a project for the RMS where we are aiming to reduce the time, cost and safety risks of surveying and inspecting a bridge, using drone devices and 360-degree video technology. This led to the build of a custom drone that we are prototyping into an MVP (minimum viable product) ready for field testing. It stemmed from projects I completed throughout university that developed my passion for drone technology and its commercial and aesthetic applications.
At the moment I’m learning as much as I can at iAccelerate, a business incubator that gives local businesses a learning environment, a physical space and a wealth of support. It allows people like myself, who love to rapid prototype and test things as soon as possible, to keep creating new content and to get in touch with people who can assist in entrepreneurial endeavours.
This work is one I created last year as a showcase (very quick, very unedited) of a glimpse of the device’s potential for inspecting a bridge. It was uploaded to YouTube on the lowest settings to ensure fluid movement around the screen, so that I could quickly link the video to those at the Roads and Maritime Services here in Wollongong, working with a team from Parkes, NSW. The software that runs the footage allows 4K images as well as a zoom option that would obviously be used in real scenarios; this piece simply showcased, and allowed feedback on, stability and the true potential of 360 degree film.
Something I created less as a working portfolio piece and more as an exploration of what else this device could do was filming my friend Sonny riding a skateboard around a carpark. This got me thinking about how I’d target this device at a greater or wider audience. All of this was me trying to turn this practice into a career. How could I weave a device used for infrastructure into an aesthetic? I decided to market the footage by giving the audience, or viewer of the content, the freedom to click and drag to view whatever section of the 360 video they wanted.
I achieved this by creating a mount for the bottom of the drone that was used for anything below the horizontal axis, but could easily and quickly be switched to the top too. If I was going to give the product a go in a market that’s quite contested, I had to stand out with both its uses and its practicality.
Lastly, I took this device to a purely aesthetic location, testing the aerial altitude with the camera views as the subject.
These works, along with a collection of files on the software that stitches these videos into the product we see here, are some things I like to keep on record for when I start to really give this a shot. For the last year I’ve taken my research into the physical: field testing and asking as much as I can of the engineers who will hopefully be using this device. I have said yes to countless offers of free labour and learning opportunities, only ever requesting that I can use them in a professional collection afterwards. I believe this, along with working on this mostly solo, has resulted, and continues to result, in me knowing the product and the market need very thoroughly.
The three target organisations I have in my sights would be:
- Firstly, the RMS: these people funded and took me on under a project with the LookUp. This device will be first and foremost for them and their work with infrastructure.
- Secondly, creatives: what can YOU do with this? The ability to look up or down from an aerial device in 360 degree HD footage. What can a creative do with this kind of freedom, and the ability to do both at the same time? The footage can be viewed in VR goggles and produce a live feed. The commercial market can decide.
- Thirdly, hobbyists and tinkerers: the people I assume want to get into drones and build their own modifications. This device could be something people take apart and re-assemble to fit their individual needs, or use to trial some of their own ideas.
I think the direction I need to take also resonates with what our guest lecturer Paul Jones said: keep a positive work ethic, keep making works that resonate with the line of work you want to be in, and then talk to the people you need to, whether they be lecturers, mentors or potential employers, and show them that you’re dedicated to your craft.