<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:media="http://search.yahoo.com/mrss/"><channel><title><![CDATA[bHaptic Project Database]]></title><description><![CDATA[Immersive, Interactive, and Emergent Media]]></description><link>http://b.bhaptic.net/</link><image><url>http://b.bhaptic.net/favicon.png</url><title>bHaptic Project Database</title><link>http://b.bhaptic.net/</link></image><generator>Ghost 2.6</generator><lastBuildDate>Fri, 01 May 2026 20:21:12 GMT</lastBuildDate><atom:link href="http://b.bhaptic.net/rss/" rel="self" type="application/rss+xml"/><ttl>60</ttl><item><title><![CDATA[Jinn-Ginnaye:Palmeira, 2024-pres]]></title><description><![CDATA[<p>Shown at the Norfolk and Norwich Festival, May 2025</p><figure class="kg-card kg-embed-card"><iframe src="https://player.vimeo.com/video/1122828226?app_id=122963" width="426" height="240" frameborder="0" allow="autoplay; fullscreen; picture-in-picture; clipboard-write; encrypted-media; web-share" referrerpolicy="strict-origin-when-cross-origin" title="Ginnaye_NN_2025"></iframe><figcaption>Jinn-Ginnaye performance at Norfolk and Norwich Festival, May 2025</figcaption></figure>]]></description><link>http://b.bhaptic.net/nn_ginnaye/</link><guid isPermaLink="false">68da748766c1b50d5f890b39</guid><dc:creator><![CDATA[Kirk Woolford]]></dc:creator><pubDate>Mon, 29 Sep 2025 13:02:17 GMT</pubDate><content:encoded><![CDATA[<p>Shown at the Norfolk and Norwich Festival, May 2025</p><figure class="kg-card kg-embed-card"><iframe src="https://player.vimeo.com/video/1122828226?app_id=122963" width="426" height="240" frameborder="0" allow="autoplay; fullscreen; picture-in-picture; clipboard-write; encrypted-media; web-share" referrerpolicy="strict-origin-when-cross-origin" title="Ginnaye_NN_2025"></iframe><figcaption>Jinn-Ginnaye performance at Norfolk and Norwich 
Festival, May 2025</figcaption></figure>]]></content:encoded></item><item><title><![CDATA[Photuria 2016-pres]]></title><description><![CDATA[An immersive piece  addressing climate change and loss of biodiversity in the United States through a crime-fiction narrative exploring the disappearance of fireflies. Photuris asks audiences to reflect upon the cultural, ecological, and financial value of the firefly. ]]></description><link>http://b.bhaptic.net/photuria/</link><guid isPermaLink="false">5db6b60c66c1b50d5f8908d5</guid><category><![CDATA[Immersive]]></category><category><![CDATA[Installation]]></category><category><![CDATA[Research]]></category><dc:creator><![CDATA[Kirk Woolford]]></dc:creator><pubDate>Mon, 19 Apr 2021 08:36:00 GMT</pubDate><media:content url="http://b.bhaptic.net/content/images/2019/10/Photuris1.jpg" medium="image"/><content:encoded><![CDATA[<img src="http://b.bhaptic.net/content/images/2019/10/Photuris1.jpg" alt="Photuria 2016-pres"><p>An immersive piece  addressing climate change and loss of biodiversity in the United States through a crime-fiction narrative exploring the disappearance of fireflies. Photuris asks audiences to reflect upon the cultural, ecological, and financial value of the firefly. </p><p>• Fireflies are a part of our biodiversity heritage and are iconic insects that have been the subject of much investigation in the sciences, an inspiration in the arts and a part of local cultures, folklores and traditions because of their ability to produce light.</p><p>• Fireflies have been a source of ecotourism revenue for many communities in different parts of the world and have the potential to bring similar benefits to other local communities. 
Fireflies and their natural habitats also enhance quality of life and contribute to economies through the promotion of aesthetically pleasing landscapes that have greater appeal.</p><p>• Fireflies are bio-indicators of the health of the environment and are declining across the world as a result of degradation and loss of suitable habitat, pollution of river and water systems, increased use of pesticides in agro-ecosystems, non-regulated commercial harvesting and increased ecological light pollution in areas of human habitation. </p><p>While observing firefly behavior, several naturalists have noted that females of the genus Photuris are carnivorous. Many have discovered this by trying to keep groups of fireflies alive overnight in the same container. In the morning one usually finds one Photuris female and bits and pieces of all the rest. </p><p></p><figure class="kg-card kg-embed-card"><iframe src="https://player.vimeo.com/video/369275154?app_id=122963" width="240" height="240" frameborder="0" allow="autoplay; fullscreen" allowfullscreen title="Foturis"></iframe></figure><p></p><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2021/04/image-8.png" class="kg-image" alt="Photuria 2016-pres"><figcaption><a href="https://www.inaturalist.org/projects/fireflyers-international">https://www.inaturalist.org/projects/fireflyers-international</a></figcaption></figure>]]></content:encoded></item><item><title><![CDATA[vRSP: virtually (Re)Sounding Place, 2018-2020]]></title><description><![CDATA[The vRSP project explores new, responsive, immersive, and interactive methods of experiencing a performance, as well as allowing users to explore the unique heritage places for which these performances have been created.
]]></description><link>http://b.bhaptic.net/vrsp/</link><guid isPermaLink="false">5bf5514566c1b50d5f890800</guid><category><![CDATA[Director]]></category><category><![CDATA[Performance]]></category><category><![CDATA[VR/AR/XR]]></category><category><![CDATA[Research]]></category><category><![CDATA[Immersive]]></category><dc:creator><![CDATA[Kirk Woolford]]></dc:creator><pubDate>Sun, 18 Apr 2021 20:35:00 GMT</pubDate><media:content url="http://b.bhaptic.net/content/images/2018/09/vRSP_sing-2.png" medium="image"/><content:encoded><![CDATA[<figure class="kg-card kg-image-card kg-width-full"><img src="http://b.bhaptic.net/content/images/2018/09/StGiles_pano.png" class="kg-image" alt="vRSP: virtually (Re)Sounding Place, 2018-2020"></figure><hr><img src="http://b.bhaptic.net/content/images/2018/09/vRSP_sing-2.png" alt="vRSP: virtually (Re)Sounding Place, 2018-2020"><p>More detailed information about the project can be found on the project website at <a href="http://resounding.place/">http://resounding.place/</a></p>
<p>The vRSP network explores new forms of immersive performance experience in sites of historical and cultural import. It will start by examining new works by composer Michael Price, who has been exploring the use of 360 video and audio to document performances in National Trust properties. The research network will examine how these performances have been recorded and converted to 360 videos, and offer new solutions using ongoing research from the University of Surrey’s Institute of Sound Recording, Centre for Vision, Speech and Signal Processing, 5G Innovation Centre, and Digital Media Arts programmes.</p>
<figure class="kg-card kg-embed-card"><iframe src="https://player.vimeo.com/video/328280561?app_id=122963" width="640" height="360" frameborder="0" allow="autoplay; fullscreen" allowfullscreen title="vRSP Work in Progress, 2019"></iframe></figure><p>Many spatial experiences are not well represented through current 360 video and immersive technologies. Despite Ambisonics being adopted as the de facto soundfield capture and manipulation standard by Facebook and other 3D-audio content generation platforms, current 360 video experiences lack spatial audio detail and the ability to explore the soundfield convincingly. This leads to a considerably reduced immersive experience. While equipment manufacturers currently focus on increasing image quality through higher pixel counts, there is a gap in both documentation and content of high-quality spatial audio captured outside of game-engine-based experiences.</p>
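The soundfield-exploration problem described above comes down to rotating the captured Ambisonic signal against the listener's head orientation. As a minimal sketch of the idea (not the project's actual pipeline; sign conventions differ between the FuMa and AmbiX formats, so treat the signs here as an assumption), a first-order B-format sample can be yaw-rotated per sample:

```python
import math

def rotate_bformat_yaw(w, x, y, z, theta):
    """Rotate one first-order B-format sample (W, X, Y, Z) about the
    vertical axis by theta radians, e.g. to counter head-tracking yaw.
    A pure yaw rotation leaves W (omnidirectional) and Z (height)
    untouched; only the horizontal components X and Y mix."""
    c, s = math.cos(theta), math.sin(theta)
    return (w, c * x - s * y, s * x + c * y, z)
```

Applied across all four channels at audio rate, this gives a head-tracked soundfield without re-rendering the recording, which is why Ambisonics suits 360 video delivery.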
<p>The research network will explore how new forms of immersive experience can be created to allow future audiences to move through the space during the performance and explore the relationship between the performance and location. The network will also examine how extra layers or “maps” of the site can be added to allow the audience to explore the history and significance of the site as well as the process of creating the work. Finally, the vRSP network will explore how an immersive experience can go beyond what is possible in a live performance by allowing the experience to respond to the presence of the audience.</p>
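One simple way an experience can respond to the presence of the audience is to scale a layer's audibility by the viewer's distance from it, so commentary or history "maps" fade in as the viewer approaches. A hypothetical sketch of that behaviour (the names and falloff curve are illustrative, not the network's implementation):

```python
import math

def proximity_gain(listener, source, radius=3.0):
    """Return a gain in (0, 1]: full volume within `radius` metres of a
    virtual performer or commentary layer, inverse-distance falloff
    beyond it, so content fades in as the viewer walks closer."""
    d = math.dist(listener, source)
    return 1.0 if d <= radius else radius / d
```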
<p>vRSP is developing outputs which are exemplars of a wholly new genre of artwork, designed for virtual space but based on notable and rare historic places for which music will be written to integrate musical and social memory, and exploit unique architectural and acoustic environments. Live performances of newly composed works will be captured and disseminated to viewers and listeners as VR content through the application of a series of novel production techniques. These will enhance and address specific difficulties presented by current technologies, as described above, by applying interdisciplinary knowledge combining research from human perception, neuroscience, musical composition, animation, visual surface-mapping techniques and visual signal processing, stereoscopic 360-degree video and three-dimensional audio recording and reproduction methods. The network draws together experienced practitioners from all fields to inform the creation of a new work that will portray the intrinsic qualities of the historic performance space through site-specific live performance and a combination of contemporary video- and animation-based techniques for VR generation.</p>
<p>The core project team includes Michael Price, one of the UK's most sought-after composers, known for Emmy-winning compositions for BBC's Sherlock and Unforgotten as well as work on films including Peter Jackson's The Lord of the Rings trilogy, Richard Curtis' Love Actually, Bridget Jones: The Edge of Reason and Alfonso Cuaron's Children of Men. Alongside his film and TV work, Price is signed to the label Erased Tapes Records and is releasing a new album recorded in six locations across the UK, ranging from a Tudor mansion to tunnels within the White Cliffs of Dover. Each piece was recorded onsite and composed in response to the site and its history. After enquiring with industry colleagues about methods of recording and reproducing immersive audio, Price began consulting directly with equipment manufacturers in search of solutions, becoming an associate member of the BAFTA-VR Advisory Group. vRSP will allow Price to work together with University of Surrey academics Kirk Woolford and Tony Myatt. Woolford is an expert on interactive media content creation with more than 150 interactive works created, 30 years' experience developing immersive performances, including 20 years of VR performances, and 8 years' work with AR. Myatt is the head of Music and Media at the University of Surrey, founding director of the University of York's Music Research Centre, and one of the foremost authorities on spatial audio. All three core partners are active creative practitioners and perform their works worldwide.</p>
<p></p><p></p><figure class="kg-card kg-embed-card"><iframe src="https://player.vimeo.com/video/284358463?app_id=122963" width="640" height="360" frameborder="0" title="vRSP Volumetric capture and Shader Graph test" webkitallowfullscreen="" mozallowfullscreen="" allowfullscreen></iframe></figure><p></p><figure class="kg-card kg-embed-card"><iframe src="https://player.vimeo.com/video/284358253?app_id=122963" width="640" height="360" frameborder="0" title="vRSP Audio analysis and rendering test" webkitallowfullscreen="" mozallowfullscreen="" allowfullscreen></iframe></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/09/DSC_7438.jpg" class="kg-image" alt="vRSP: virtually (Re)Sounding Place, 2018-2020"></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/09/DSC_7278.jpg" class="kg-image" alt="vRSP: virtually (Re)Sounding Place, 2018-2020"></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/09/DSC_7249.jpg" class="kg-image" alt="vRSP: virtually (Re)Sounding Place, 2018-2020"></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/09/DSC_7538.jpg" class="kg-image" alt="vRSP: virtually (Re)Sounding Place, 2018-2020"></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/09/DSC_7557-1.jpg" class="kg-image" alt="vRSP: virtually (Re)Sounding Place, 2018-2020"></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/09/IMG_0861.jpg" class="kg-image" alt="vRSP: virtually (Re)Sounding Place, 2018-2020"></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/09/DSC_7518.jpg" class="kg-image" alt="vRSP: virtually (Re)Sounding Place, 2018-2020"></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/09/vRSP_sing-1.png" class="kg-image" 
alt="vRSP: virtually (Re)Sounding Place, 2018-2020"></figure>]]></content:encoded></item><item><title><![CDATA[Cambridge Arts Network: Virtual and Augmented Realities]]></title><description><![CDATA[When it is too dangerous to bring an audience together in a single space, how can we use creative technologies to bring audiences together virtually?]]></description><link>http://b.bhaptic.net/can-3/</link><guid isPermaLink="false">5fd32cbf66c1b50d5f8909c8</guid><category><![CDATA[Teaching]]></category><dc:creator><![CDATA[Kirk Woolford]]></dc:creator><pubDate>Fri, 11 Dec 2020 08:37:40 GMT</pubDate><content:encoded><![CDATA[<p>During times when it is too dangerous to bring an audience together in a single space, how can we use creative technologies to either bring audiences together into a virtual space, or to leave a virtual performance or object in a space for audiences to come and go over time? This workshop will introduce participants to new forms of virtual and augmented performance. We will explore Unity3D’s XR toolkit and discuss challenges of working with headsets and mobile phones.</p><p>Dr Kirk Woolford, kirk.woolford@aru.ac.uk</p><hr><p><strong>What is XR:</strong></p><p>VR immerses people in a completely virtual environment. AR overlays virtual content on the real world, but that content cannot interact with the environment. MR is a mix of the two, creating virtual objects capable of interacting with the actual environment.
XR brings all these (AR, VR, MR) together under one term.</p><p>Virtual Reality Society: <a href="https://www.vrs.org.uk/virtual-reality/what-is-virtual-reality.html">https://www.vrs.org.uk/virtual-reality/what-is-virtual-reality.html</a></p><p>Wired's Guide: <a href="https://www.wired.com/story/wired-guide-to-virtual-reality/">https://www.wired.com/story/wired-guide-to-virtual-reality/</a></p><p>AR vs VR: <a href="https://uk.pcmag.com/virtual-reality/86123/augmented-reality-ar-vs-virtual-reality-vr-whats-the-difference">https://uk.pcmag.com/virtual-reality/86123/augmented-reality-ar-vs-virtual-reality-vr-whats-the-difference</a></p><hr><p><strong>Hardware:</strong></p><p>Oculus (Facebook): <a href="https://www.oculus.com">https://www.oculus.com</a></p><p>HTC: <a href="https://www.vive.com/uk/">https://www.vive.com/uk/</a></p><p>Valve (Steam): <a href="https://store.steampowered.com/vrhardware/">https://store.steampowered.com/vrhardware/</a></p><p>Apple: <a href="https://www.apple.com/augmented-reality/">https://www.apple.com/augmented-reality/</a></p><hr><p><strong>Authoring Tools:</strong></p><p>Unity: <a href="https://unity.com">https://unity.com</a></p><p>Unreal: <a href="https://www.unrealengine.com/">https://www.unrealengine.com/</a></p><p>Apple Reality Composer: <a href="https://developer.apple.com/augmented-reality/tools/">https://developer.apple.com/augmented-reality/tools/</a></p><p>Adobe Aero: <a href="https://www.adobe.com/products/aero.html">https://www.adobe.com/products/aero.html</a></p><hr><p><strong>Publishing Platforms:</strong></p><p>Vive Arts: <a href="https://arts.vive.com/uk/our-mission/">https://arts.vive.com/uk/our-mission/</a></p><p>STEAM: <a href="https://store.steampowered.com/vr/">https://store.steampowered.com/vr/</a></p><p>Oculus: <a href="https://www.oculus.com/experiences/quest/">https://www.oculus.com/experiences/quest/</a></p><p>Epic Games: <a 
href="https://www.epicgames.com/store/en-US/browse?q=VR&amp;sortBy=relevance&amp;sortDir=DESC&amp;pageSize=30">https://www.epicgames.com/store/en-US/browse?q=VR&amp;sortBy=relevance&amp;sortDir=DESC&amp;pageSize=30</a></p><p></p><p>Google closing "virtual field trips": <a href="https://xblog.google/outreach-initiatives/education/expanding-google-arts-and-culture-expeditions/#:~:text=That%20spirit%20of%20possibility%20also,about%20cultures%20unlike%20their%20own">https://xblog.google/outreach-initiatives/education/expanding-google-arts-and-culture-expeditions/</a></p><p><a href="https://www.theverge.com/2020/11/13/21564279/google-expeditions-vr-cardboard-tours-shutdown-arts-culture-app-migration">https://www.theverge.com/2020/11/13/21564279/google-expeditions-vr-cardboard-tours-shutdown-arts-culture-app-migration</a></p><hr><p>Click on "Immersive" for project documentation —&gt;</p>]]></content:encoded></item><item><title><![CDATA[Cambridge Arts Network: Interactive Narrative]]></title><description><![CDATA[How do you tell an open-ended story, where the audience actively engages in directing the narrative?]]></description><link>http://b.bhaptic.net/can_2/</link><guid isPermaLink="false">5fd0929d66c1b50d5f890993</guid><category><![CDATA[Teaching]]></category><dc:creator><![CDATA[Kirk Woolford]]></dc:creator><pubDate>Wed, 09 Dec 2020 09:13:43 GMT</pubDate><content:encoded><![CDATA[<p>How do you tell an open-ended story, where the audience actively engages in directing the narrative? How can you use websites and apps to bring your stories to new audiences? This workshop introduces participants to open and multi-linear screenwriting (i.e. stories with numerous threads). The workshop will also introduce participants to the Unity3D development environment and toolkits for rapidly creating 2D and 3D narratives.
We will create an interactive story and share it on Android, iOS and a website.</p><p>Dr Kirk Woolford, kirk.woolford@aru.ac.uk</p><p></p><p><a href="https://eko.com/">https://eko.com/</a></p><p><a href="https://guides.eko.com/">https://guides.eko.com/</a></p><p><a href="https://youtu.be/8ozoIlPyGdA">https://youtu.be/8ozoIlPyGdA</a></p><hr><p><a href="http://twinery.org/">http://twinery.org/</a></p><p><a href="https://twinery.org/wiki/twine2:guide">https://twinery.org/wiki/twine2:guide</a></p><p><a href="http://www.adamhammond.com/twineguide/">http://www.adamhammond.com/twineguide/</a></p><p><a href="https://m.youtube.com/channel/UCJP9KsNr3DEdOVeHUsI0fXQ/videos">https://m.youtube.com/channel/UCJP9KsNr3DEdOVeHUsI0fXQ/videos</a></p><p><a href="https://www.youtube.com/watch?v=g7VYL8xqJnQ">https://www.youtube.com/watch?v=g7VYL8xqJnQ</a></p><hr><p><a href="http://korsakow.com/">http://korsakow.com/</a></p><hr><p></p><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2020/12/linear-narrative-structure.gif" class="kg-image"></figure><h3 id="linear-narrative">Linear Narrative</h3><p></p><p></p><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2020/12/branching-narrative-structure.gif" class="kg-image"><figcaption>Branching Narrative</figcaption></figure><p>A relatively simple but popular narrative structure where the viewer is faced with multiple decisions and each choice affects the route they take through the story – the narrative branching out with different endings depending on the decisions they make, much like a ‘choose your own adventure’ book or game. Depending on how many branches the narrative contains, this type of structure can get very complex, very quickly.
However, if you want a structure where the audience really feels like they are in control of the outcome, this will be the perfect fit.</p><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2020/12/fishbone-narrative-structure-1.gif" class="kg-image"><figcaption>Fishbone Narrative</figcaption></figure><p>With a traditional linear structure running through its core, the Fishbone narrative allows viewers to veer off and explore the sub-stories of its tale, but always returns them to the main thread of its story. A structure suitable for those not looking to push boundaries with their film (it doesn’t feel too detached from a linear narrative), but still wanting to add more immersion than is possible with traditional filmmaking. With this structure you still get a great deal of control over the route a viewer takes through your project.</p><p></p><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2020/12/parallel-narrative-structure-1.gif" class="kg-image"><figcaption>Parallel or Diamond Narrative</figcaption></figure><p>Somewhat of a blend between a traditional linear narrative and a branching structure, the Parallel format means viewers are presented with choices in the story and although these decisions alter the route they take, they always return to the main narrative thread for pivotal moments. More complex and interactive than the Fishbone structure, but not as loose as the Branching approach, the Parallel narrative gives the impression of a ‘choose-your-own-adventure’ whilst still allowing moments of controlled guidance through your story.</p><p></p><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2020/12/threaded-narrative-structure-1.gif" class="kg-image"><figcaption>Threaded Narrative</figcaption></figure><p>A structure often preferred for  telling documentary stories through multiple points-of-view. Threads can link together or stay totally separate. 
The course of the plot does not follow a single path in this form of structure. Rather, the story comprises a number of different threads that develop largely independently.</p><p></p><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2020/12/concentric-narrative-structure.gif" class="kg-image"><figcaption>Concentric Narrative</figcaption></figure><p>With a name that suggests a structure which orbits around a shared central point, it’ll be no surprise to discover that this format revolves around one main hub which contains multiple entry points to different threads of the story. Viewers can choose which path they take, in whatever order they fancy, but they always return to this core area. Though this format is relatively easy to set up and provides a great deal of freedom and interactivity, this structure does mean you relinquish a lot of the control over what your audience views and although they might take in a lot of information, they might not get the ‘journey’ the other formats provide.</p><p></p>]]></content:encoded></item><item><title><![CDATA[Cambridge Arts Network: Performing through Streaming Media]]></title><description><![CDATA[An historical overview of performing at a distance, best practices for working with remote audiences, and the software, cameras, microphones, and techniques most frequently used in 2020.]]></description><link>http://b.bhaptic.net/can_1/</link><guid isPermaLink="false">5fcd3fc866c1b50d5f890934</guid><dc:creator><![CDATA[Kirk Woolford]]></dc:creator><pubDate>Mon, 07 Dec 2020 09:33:30 GMT</pubDate><content:encoded><![CDATA[<p>Streaming has become ubiquitous during the Covid-19 pandemic, but are you aware that video streaming has been used in performance for more than 50 years, and that the telephone was originally designed to transmit live opera performances?
This workshop will give you an historical overview of performing at a distance, discuss best practices for working with remote audiences, and introduce the software, cameras, microphones, and techniques most frequently used in 2020.</p><p>Kirk Woolford: kirk.woolford@aru.ac.uk</p><hr><p><strong>History</strong></p><p>A new stage age: why theatres should embrace digital technology (2010): <a href="https://www.theguardian.com/stage/theatreblog/2010/mar/23/stage-theatre-digital-technology-ished">https://www.theguardian.com/stage/theatreblog/2010/mar/23/stage-theatre-digital-technology-ished</a></p><p>Electronic Cafe International (Hole in Space): <a href="http://www.ecafe.com/museum/history/ksoverview2.html">http://www.ecafe.com/museum/history/ksoverview2.html</a></p><p>Telematic Dreaming, Paul Sermon: <a href="http://www.paulsermon.org/dream/">http://www.paulsermon.org/dream/</a></p><p>cyberSM: <a href="https://b.bhaptic.net/cybersm/">https://b.bhaptic.net/cybersm/</a></p><p>Diller + Scofidio Refresh: <a href="https://www.diaart.org/exhibition/exhibitions-projects/diller-scofidio-refresh-web-project/tim-rollins-and-k-o-s-prometheus-bound-web-project/home.html">https://www.diaart.org/exhibition/exhibitions-projects/diller-scofidio-refresh-web-project/tim-rollins-and-k-o-s-prometheus-bound-web-project/home.html</a></p><p>Liveform Telekinetics: <a href="https://waag.org/en/project/liveform-telekinetics">https://waag.org/en/project/liveform-telekinetics</a></p><p>Waag Connected: <a href="http://connected.waag.org/artistinresidence.html">http://connected.waag.org/artistinresidence.html</a></p><p>The Virtual Embrace: <a href="http://telematic.walkerart.org/overview/index.html">http://telematic.walkerart.org/overview/index.html</a></p><p>Telematic Sonic Performances: <a href="https://thesampler.org/guest-editor/telematic-sonic-performance-part-1/">https://thesampler.org/guest-editor/telematic-sonic-performance-part-1/</a></p><p>An overview of Telepresence: <a 
href="https://amedleyofpotpourri.blogspot.com/2018/06/telepresence.html">https://amedleyofpotpourri.blogspot.com/2018/06/telepresence.html</a></p><p></p><hr><p><strong>Current Projects</strong></p><p>ACE Digital R&amp;D Fund: <a href="https://www.artscouncil.org.uk/creative-media/digital-rd-fund-arts">https://www.artscouncil.org.uk/creative-media/digital-rd-fund-arts</a></p><p>Digital R&amp;D Fund for the Arts project Archive: <a href="https://webarchive.nationalarchives.gov.uk/20161103173438uo_/http://artsdigitalrnd.org.uk/projects/">https://webarchive.nationalarchives.gov.uk/20161103173438uo_/http://artsdigitalrnd.org.uk/projects/</a></p><p>Coney, Better Than Life: <a href="https://www.theguardian.com/stage/theatreblog/2014/jul/01/coney-no-island-streamed-theatre-audiences">https://www.theguardian.com/stage/theatreblog/2014/jul/01/coney-no-island-streamed-theatre-audiences</a></p><p>ISEA New Media Performance: <a href="http://www.isea2013.org/events/new-media-performance/">http://www.isea2013.org/events/new-media-performance/</a></p><p></p><hr><p><strong>Streaming Software:</strong></p><p>Open Broadcaster Software: <a href="https://obsproject.com">https://obsproject.com</a></p><p>Zoom Webinars: <a href="https://zoom.us/webinar">https://zoom.us/webinar</a></p><p>BlueJeans Events: <a href="https://www.bluejeans.com/products/events">https://www.bluejeans.com/products/events</a></p><p>Max/MSP Networking: <a href="https://cycling74.com/tutorials/networking-max-talking-to-max">https://cycling74.com/tutorials/networking-max-talking-to-max</a></p><p>Open Sound Control: <a href="http://opensoundcontrol.org/introduction-osc">http://opensoundcontrol.org/introduction-osc</a></p><p>Jack (Synchronising Audio): <a href="https://jackaudio.org">https://jackaudio.org</a></p><p>NetJack: <a href="https://ccrma.stanford.edu/book/export/html/2835">https://ccrma.stanford.edu/book/export/html/2835</a></p><p>Sndio: <a
href="https://sndio.org">https://sndio.org</a></p><hr><p><strong>Streaming Hardware:</strong></p><p>Elgato Cam Link: <a href="https://www.amazon.co.uk/Elgato-Cam-Link-Broadcast-camcorder/dp/B07K3FN5MR?ref_=ast_sto_dp">https://www.amazon.co.uk/Elgato-Cam-Link-Broadcast-camcorder/dp/B07K3FN5MR?ref_=ast_sto_dp</a></p><p>Elgato HD60 S+ Capture Card: <a href="https://www.amazon.co.uk/1080p60-Zero-Lag-Passthrough-Ultra-Low-Technology/dp/B07XB6VNLJ?ref_=ast_sto_dp">https://www.amazon.co.uk/1080p60-Zero-Lag-Passthrough-Ultra-Low-Technology/dp/B07XB6VNLJ?ref_=ast_sto_dp</a></p><p>Blackmagic Intensity Pro 4K: <a href="https://www.jigsaw24.com/products/blackmagic-intensity-pro-4k-x884aak#specifications-video-inputs">https://www.jigsaw24.com/products/blackmagic-intensity-pro-4k-x884aak#specifications-video-inputs</a></p><p>Blackmagic Design Web Presenter: <a href="https://www.jigsaw24.com/products/blackmagic-design-web-presenter-x709aar">https://www.jigsaw24.com/products/blackmagic-design-web-presenter-x709aar</a></p><p></p><p></p>]]></content:encoded></item><item><title><![CDATA[Jinn Ginnaye, 2014-pres]]></title><description><![CDATA[Jinn-Ginnaye is a practice-based research project developed in parallel with the live choreomusical work, Jinn. Jinn-Ginnaye asks how the constraints of Islamic culture can lead to new forms of creation and expression.]]></description><link>http://b.bhaptic.net/jinn-ginnaye/</link><guid isPermaLink="false">5bf5514566c1b50d5f890803</guid><category><![CDATA[Director]]></category><category><![CDATA[VR/AR/XR]]></category><category><![CDATA[Research]]></category><category><![CDATA[Immersive]]></category><dc:creator><![CDATA[Kirk Woolford]]></dc:creator><pubDate>Tue, 01 Sep 2020 20:46:00 GMT</pubDate><media:content url="http://b.bhaptic.net/content/images/2018/09/Trim_Jinn.jpg" medium="image"/><content:encoded><![CDATA[<img src="http://b.bhaptic.net/content/images/2018/09/Trim_Jinn.jpg" alt="Jinn Ginnaye, 2014-pres"><p>Jinn-Ginnaye is a practice-based research project developed in parallel with the live choreomusical work, Jinn.
The impetus for Jinn-Ginnaye came from experiences of censorship when the International Society of Electronic Arts (ISEA) conference was held in Dubai in 2015, and all images of women's bodies had to be removed from public presentations. Jinn-Ginnaye asks how the constraints of Islamic culture can lead to new forms of creation and expression.</p>
<p>Jinn-Ginnaye is an exploration of movement in place. It is a collection of dance pieces exploring issues of bringing western dance performance to the United Arab Emirates, where local modesty laws influence how women can be shown in public. The pieces use video compositing, motion capture, and Virtual Reality techniques to remove the body of the dancer, but leave behind the dance, and the traces of the desert in which it was created.</p>
<figure class="kg-card kg-embed-card"><iframe src="https://player.vimeo.com/video/328281911?app_id=122963" width="640" height="360" frameborder="0" allow="autoplay; fullscreen" allowfullscreen title="Jinn-Ginnaye REF Draft"></iframe></figure><p>The Jinn project was initiated by Carlos Guedes, who was interested in exploring the unique sounds of movement in the desert surrounding Abu Dhabi. He had been experimenting with methods of capturing the sound of wind across the dunes when he noticed how various forms and densities of sand created very different sounds as he walked through them. In order to explore this further, Guedes invited a number of dancers to go out into the dunes with him, and they experimented with the best forms of movement to create sound. While I was in Abu Dhabi to make a site-specific motion capture piece for the ISEA 2014 conference, Carlos invited me to take one of my inertial motion capture suits into the desert so we could capture both the sound and the movement of a dancer in the desert.</p>
<p>United Arab Emirates modesty laws place restrictions on depictions of the human body in public. I decided to use the UAE restrictions as a creative tool, and developed software to create a “sand dancer” which would not offend our hosts in the UAE, but would tap into local legends of Jinn, and the older Ginnaye linked to the term Genius Loci, or spirit of a place. The sand dancer needed to re-form and scatter – taking human form only long enough to be recognized, then blowing like sand in the desert wind.</p>
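The sand dancer's behaviour — a figure that takes human form only long enough to be recognised, then scatters — can be thought of as particles torn between motion-capture joint targets and wind. The actual software was built in Unity3D; this is only a minimal Python sketch of the idea, with all names and parameters illustrative:

```python
import random

def step_sand_dancer(particles, targets, wind=(0.0, 0.0, 0.0),
                     cohesion=0.1, gust=0.0):
    """One simulation step: each grain moves a fraction (`cohesion`) of
    the way toward its motion-capture joint target, then is pushed by a
    constant `wind` and a random `gust`. High cohesion re-forms the
    dancer; a strong gust blows the figure apart like dry sand."""
    new_particles = []
    for (px, py, pz), (tx, ty, tz) in zip(particles, targets):
        blow = lambda: random.uniform(-gust, gust)
        new_particles.append((
            px + cohesion * (tx - px) + wind[0] + blow(),
            py + cohesion * (ty - py) + wind[1] + blow(),
            pz + cohesion * (tz - pz) + wind[2] + blow(),
        ))
    return new_particles
```

Animating `cohesion` and `gust` over time produces the cycle of recognition and dispersal described above.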
<p>Over the course of two years, we built up a collection of audio, video and movement recordings together with 360° spherical images of the capture locations. I developed custom software using the Unity3D game engine, Google Cardboard, Samsung Gear VR and the Oculus Rift DK2 to place these sand dancers in the middle of the Rub’ al Khali desert. Through these VR tools, we were able to let audiences travel instantly to one of the most inhospitable locations on the planet and meet a representation of the spirit of the place.</p>
<p>During performances of Jinn in Abu Dhabi, audiences saw a live dancer perform in front of a video projection filmed in the Rub’ al Khali desert. After several performances, we distributed Samsung Gear VRs, and iPhones with Google Cardboard viewers. We put the dancer into an inertial motion capture suit and performed a sand dance in which her movements were transmitted to the VR viewers, allowing the audience to see her projected in 360-degree stereo into the remote location. The audience reported that the VR viewers gave the performance a more immediate context. They felt a clearer, stronger link to the original location after viewing it in VR, but none of the audience members watched the entire performance in the immersive environment. They all chose to take off the 3D viewers and watch the live dancer in front of them. Two audience members reported that the sand dancers felt like a gimmick when viewed through the VR display, but viewing the desert location through the VR goggles changed their experience of the whole performance. They felt a greater degree of presence. Their experience of the performance was enhanced by this new form of photography, but they felt the presence of the live performer was stronger when she stood on the stage in front of them.</p>
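The live link described above — a dancer's capture data streamed to audience headsets each frame — can be sketched as a small UDP broadcaster. This is a hedged illustration, not the production code: the real system used inertial mocap hardware and VR runtimes, and the port, packet format, and joint names below are invented for the example.

```python
import json
import socket

VIEWER_PORT = 9000  # illustrative port, not the one used in the performances

def broadcast_frame(sock, frame, viewer_addresses):
    """Send one capture frame (joint name -> rotation) to each VR viewer."""
    packet = json.dumps(frame).encode("utf-8")
    for addr in viewer_addresses:
        sock.sendto(packet, (addr, VIEWER_PORT))

def decode_frame(packet):
    """On the headset: decode a packet back into joint rotations to apply."""
    return json.loads(packet.decode("utf-8"))

if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # One hypothetical frame of the sand dance; roughly 30 such per second.
    frame = {"hips": [0.0, 12.5, 0.0], "spine": [3.1, 12.5, 0.0]}
    broadcast_frame(sock, frame, ["127.0.0.1"])
    sock.close()
```

UDP suits this kind of link because a dropped frame is simply superseded by the next one a few milliseconds later; nothing needs retransmission.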
<figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/09/image-12.png" class="kg-image" alt="Jinn Ginnaye, 2014-pres"></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/09/image-13.png" class="kg-image" alt="Jinn Ginnaye, 2014-pres"></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/09/image-11.png" class="kg-image" alt="Jinn Ginnaye, 2014-pres"></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/09/image-8.png" class="kg-image" alt="Jinn Ginnaye, 2014-pres"><figcaption>Image, ©2016, Kirk Woolford</figcaption></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/09/image-14.png" class="kg-image" alt="Jinn Ginnaye, 2014-pres"></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/09/image-15.png" class="kg-image" alt="Jinn Ginnaye, 2014-pres"></figure>]]></content:encoded></item><item><title><![CDATA[Immersed in Nature 2019]]></title><description><![CDATA[<p>An introduction to Virtual/Augmented Reality wellness apps presenting "natural" environments prepared for the Valuing Nature annual conference 2019 at the Royal Society, London</p><figure class="kg-card kg-embed-card"><iframe src="https://player.vimeo.com/video/368669004?app_id=122963" width="426" height="240" frameborder="0" allow="autoplay; fullscreen" allowfullscreen title="Woolford_Immersed_Nature"></iframe></figure><p>The video presents an overview of meditation apps, which attempt to place their users in natural environments – assuming that "natural" experiences must be calming. 
It includes screen recordings from a number of VR wellness apps.</p>]]></description><link>http://b.bhaptic.net/immersed_nature/</link><guid isPermaLink="false">5db6b38466c1b50d5f8908c4</guid><category><![CDATA[Immersive]]></category><category><![CDATA[Research]]></category><category><![CDATA[VR/AR/XR]]></category><dc:creator><![CDATA[Kirk Woolford]]></dc:creator><pubDate>Mon, 28 Oct 2019 09:27:18 GMT</pubDate><media:content url="http://b.bhaptic.net/content/images/2019/10/Imm_Thumb.png" medium="image"/><content:encoded><![CDATA[<img src="http://b.bhaptic.net/content/images/2019/10/Imm_Thumb.png" alt="Immersed in Nature 2019"><p>An introduction to Virtual/Augmented Reality wellness apps presenting "natural" environments prepared for the Valuing Nature annual conference 2019 at the Royal Society, London</p><figure class="kg-card kg-embed-card"><iframe src="https://player.vimeo.com/video/368669004?app_id=122963" width="426" height="240" frameborder="0" allow="autoplay; fullscreen" allowfullscreen title="Woolford_Immersed_Nature"></iframe></figure><p>The video presents an overview of meditation apps, which attempt to place their users in natural environments – assuming that "natural" experiences must be calming. 
It includes screen recordings from a number of VR wellness apps.</p><p></p><p><a href="https://valuing-nature.net/ValNat19/overview">https://valuing-nature.net/ValNat19/overview</a></p><p><a href="https://valuing-nature.net/cinema">https://valuing-nature.net/cinema</a></p>]]></content:encoded></item><item><title><![CDATA[Jinn, 2014-2018]]></title><description><![CDATA[Jinn is a choreomusical work, set in the desert of Rub’ al Khali, exploring dance and music in the UAE, and exchanges between Islamic and Western cultures.]]></description><link>http://b.bhaptic.net/jinn/</link><guid isPermaLink="false">5bf5514566c1b50d5f890801</guid><category><![CDATA[Performance]]></category><category><![CDATA[Director]]></category><category><![CDATA[Immersive]]></category><dc:creator><![CDATA[Kirk Woolford]]></dc:creator><pubDate>Wed, 10 Oct 2018 21:06:00 GMT</pubDate><media:content url="http://b.bhaptic.net/content/images/2018/11/DSCF4620-by-Waleed-Shah.jpg" medium="image"/><content:encoded><![CDATA[<img src="http://b.bhaptic.net/content/images/2018/11/DSCF4620-by-Waleed-Shah.jpg" alt="Jinn, 2014-2018"><p><strong>Jinn</strong> — Jinn is a choreomusical work, set in the desert of Rub’ al Khali, exploring dance and music in the United Arab Emirates, and exchanges between Islamic and Western cultures. </p><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/11/DSCF4995-by-Waleed-Shah.jpg" class="kg-image" alt="Jinn, 2014-2018"><figcaption>Photo by Waleed Shah, NYU Abu Dhabi Arts Centre, 2018</figcaption></figure><p>On Wednesday 10 October 2018, Kirk Woolford, Reader for Digital Media Arts at Surrey University, and Carlos Guedes, Associate Arts Professor at NYU, premiered their new piece, “Jinn”, at the NYUAD Arts Centre in Abu Dhabi.</p>
<p>Using live motion capture, virtual puppetry, live musicians and a multitude of sand, Jinn is a choreomusical work exploring the presentation of Western performance in Arabic cultures. The Arabic term ‘jinn’ refers to invisible, sentient beings of scorching winds and smokeless fire. Before Islam, they were worshiped as spiritual protectors not only in the Arabian Peninsula, but also in neighbouring areas. For this performance piece, the collaboration draws inspiration from the sound of movement in the desert and images of Jinn in Arab culture.</p>
<figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/11/Woolford_Jinn_capture.jpg" class="kg-image" alt="Jinn, 2014-2018"><figcaption>Image, ©2016, Kirk Woolford</figcaption></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/11/IMG_0793.JPG" class="kg-image" alt="Jinn, 2014-2018"><figcaption>Image, ©2016, Kirk Woolford</figcaption></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/11/IMG_0546.JPG" class="kg-image" alt="Jinn, 2014-2018"><figcaption>Image, ©2016, Kirk Woolford</figcaption></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/09/image-5.png" class="kg-image" alt="Jinn, 2014-2018"><figcaption>Image, ©2016, Kirk Woolford</figcaption></figure><figure class="kg-card kg-embed-card"><iframe src="https://player.vimeo.com/video/212900170?app_id=122963" width="640" height="360" frameborder="0" title="Jinn Sand Dance, 2016" webkitallowfullscreen="" mozallowfullscreen="" allowfullscreen></iframe></figure><figure class="kg-card kg-embed-card"><iframe src="https://player.vimeo.com/video/301625899?app_id=122963" width="640" height="360" frameborder="0" title="Jinn Live Sand Dance, 2018" allow="autoplay; fullscreen" allowfullscreen></iframe></figure><figure class="kg-card kg-embed-card"><iframe src="https://player.vimeo.com/video/212898842?app_id=122963" width="640" height="360" frameborder="0" title="Jinn Sand Ghost, 2016" webkitallowfullscreen="" mozallowfullscreen="" allowfullscreen></iframe></figure><p>NYU’s agreement with the Emirate of Abu Dhabi to create NYU Abu Dhabi is the outcome of a shared understanding of the essential roles and challenges of higher education in the 21st Century and a common belief in the value of a liberal arts education.</p>
<p>Through Jinn, we wanted to present a wider Western culture to Emiratis. We addressed the constraints of showing the female form in a Muslim culture by ultimately creating a dance piece without a dancer: removing the physical body, leaving just the sound and trace of the movement, and transforming the performers into creatures of sand and fire.</p>
<p>Jinn is set in an area of the Abu Dhabi desert known as the Empty Quarter. It explores the sound of human movement in this environment, as captured by video, audio, and motion recording devices. The piece combines dance and live music performance with multichannel sound diffusion, motion capture, and computer-generated graphics.</p>
<p>Guedes and Woolford have developed creative works addressing the perception of human bodily motion since 2003, in pieces such as Will.0.w1sp (Woolford &amp; Guedes, 2005) and Côr (Guedes, Ula li &amp; Woolford, 2003).</p>
<p>More information is available at the NYU Abu Dhabi Arts Centre website at: <a href="https://www.nyuad-artscenter.org/en_US/events/2018/jinn-carlos-guedes/">https://www.nyuad-artscenter.org/en_US/events/2018/jinn-carlos-guedes/</a></p>
<p>Animation: Kirk Woolford<br>
Music: Carlos Guedes</p>
<p>Choreography &amp; dance (live): Kiori Kawai<br>
Choreography &amp; dance (video): Nella Turkki<br>
Flutes &amp; voice: Cristina Ioan</p>
<p>Cinematography: Saguenail<br>
Lighting design: Simon Fraulo<br>
Costume design: Judi Olson<br>
Marimba, vibraphone, &amp; glockenspiel (recording): João Dias<br>
Sound recording &amp; technical assistance: João Menezes<br>
Stage Management: Tegan McDuffie</p>
<p></p><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/09/jinn-flyer-final-correct-1.jpg" class="kg-image" alt="Jinn, 2014-2018"></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/11/DSCF4605-by-Waleed-Shah.jpg" class="kg-image" alt="Jinn, 2014-2018"><figcaption>Photo by Waleed Shah, NYU Abu Dhabi Arts Centre, 2018</figcaption></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/11/DSCF4540-by-Waleed-Shah.jpg" class="kg-image" alt="Jinn, 2014-2018"><figcaption>Photo by Waleed Shah, NYU Abu Dhabi Arts Centre, 2018</figcaption></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/11/DSCF4620-by-Waleed-Shah-1.jpg" class="kg-image" alt="Jinn, 2014-2018"><figcaption>Photo by Waleed Shah, NYU Abu Dhabi Arts Centre, 2018</figcaption></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/11/WSHB6291-by-Waleed-Shah.jpg" class="kg-image" alt="Jinn, 2014-2018"><figcaption>Photo by Waleed Shah, NYU Abu Dhabi Arts Centre, 2018</figcaption></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/11/WSHB6332-by-Waleed-Shah.jpg" class="kg-image" alt="Jinn, 2014-2018"><figcaption>Photo by Waleed Shah, NYU Abu Dhabi Arts Centre, 2018</figcaption></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/11/WSHB6346-by-Waleed-Shah.jpg" class="kg-image" alt="Jinn, 2014-2018"><figcaption>Photo by Waleed Shah, NYU Abu Dhabi Arts Centre, 2018</figcaption></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/09/DSC_4806.jpg" class="kg-image" alt="Jinn, 2014-2018"><figcaption>Image, ©2016, Kirk Woolford</figcaption></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/09/IMG_1932.JPG" class="kg-image" 
alt="Jinn, 2014-2018"><figcaption>Image, ©2016, Kirk Woolford</figcaption></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/09/Jinn_live_test-1.png" class="kg-image" alt="Jinn, 2014-2018"><figcaption>Image, ©2018, Kirk Woolford</figcaption></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/09/image-7.png" class="kg-image" alt="Jinn, 2014-2018"><figcaption>Image, ©2016, Kirk Woolford</figcaption></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/11/IMG_0387.JPG" class="kg-image" alt="Jinn, 2014-2018"><figcaption>Image, ©2016, Kirk Woolford</figcaption></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/11/IMG_0666.JPG" class="kg-image" alt="Jinn, 2014-2018"><figcaption>Image, ©2016, Kirk Woolford</figcaption></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/11/IMG_0908.JPG" class="kg-image" alt="Jinn, 2014-2018"><figcaption>Image, ©2016, Kirk Woolford</figcaption></figure><p></p><p></p>]]></content:encoded></item><item><title><![CDATA[Me, the Machine, 2013-2014]]></title><description><![CDATA[Particle dancer excerpt from Imogen Heap's Me the Machine video.]]></description><link>http://b.bhaptic.net/memachine/</link><guid isPermaLink="false">5bf5514566c1b50d5f8907ff</guid><category><![CDATA[Performance]]></category><category><![CDATA[Director]]></category><category><![CDATA[Immersive]]></category><dc:creator><![CDATA[Kirk Woolford]]></dc:creator><pubDate>Fri, 05 Dec 2014 22:29:00 GMT</pubDate><media:content url="http://b.bhaptic.net/content/images/2018/11/MeMachine.jpg" medium="image"/><content:encoded><![CDATA[<figure class="kg-card kg-embed-card"><iframe src="https://player.vimeo.com/video/137399315?app_id=122963" width="480" height="270" frameborder="0" title="Imogen Heap, Me the Machine particle dance, 2014" 
webkitallowfullscreen="" mozallowfullscreen="" allowfullscreen></iframe><figcaption>Imogen Heap, Me the Machine particle dance, 2014</figcaption></figure><img src="http://b.bhaptic.net/content/images/2018/11/MeMachine.jpg" alt="Me, the Machine, 2013-2014"><p>Particle dancer excerpt from Me the Machine music video.<br>The full video is available at youtube.com/watch?v=N0lCL2hpRPM</p><p>Video concept: Imogen Heap<br>Director/set design/co-editor: Ersinhan Ersin, (Marshmallow Laser Feast)<br>Co Director and Editor: Leo Fawkes<br>Producer: Marta Sala Font<br>Production management and design: Liz Berry<br>Visual/glove integration, sound: Adam Stark<br>Visual/glove integration, Kelly Snook<br>Visual Artist, Particle man visuals: Kirk Woolford</p><p>Full credits available at youtube.com/watch?v=N0lCL2hpRPM</p><p></p><figure class="kg-card kg-embed-card"><iframe src="https://player.vimeo.com/video/137397360?app_id=122963" width="480" height="270" frameborder="0" title="Imogen Heap Particle Tests, 2013" webkitallowfullscreen="" mozallowfullscreen="" allowfullscreen></iframe></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/09/MeMachine_Filming_1.png" class="kg-image" alt="Me, the Machine, 2013-2014"></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/09/MeMachine_Filming_2.png" class="kg-image" alt="Me, the Machine, 2013-2014"></figure>]]></content:encoded></item><item><title><![CDATA[Moments in Place AR Performances, 2011-2013]]></title><description><![CDATA[<p>Moments in Place is a series of site-specific augmented reality performances, created for the Brighton Digital Festival, inviting visitors to consider movement qualities of different locations in the city of Brighton as well as the range of artworks in Brighton's streets.</p><p>Each of the performances were recorded on site using</p>]]></description><link>http://b.bhaptic.net/moments-in-place/</link><guid 
isPermaLink="false">5bf5514566c1b50d5f890804</guid><category><![CDATA[Director]]></category><category><![CDATA[VR/AR/XR]]></category><category><![CDATA[Research]]></category><category><![CDATA[Performance]]></category><category><![CDATA[Immersive]]></category><dc:creator><![CDATA[Kirk Woolford]]></dc:creator><pubDate>Thu, 12 Sep 2013 10:23:00 GMT</pubDate><media:content url="http://b.bhaptic.net/content/images/2018/09/Home1-9.png" medium="image"/><content:encoded><![CDATA[<img src="http://b.bhaptic.net/content/images/2018/09/Home1-9.png" alt="Moments in Place AR Performances, 2011-2013"><p>Moments in Place is a series of site-specific augmented reality performances, created for the Brighton Digital Festival, inviting visitors to consider movement qualities of different locations in the city of Brighton as well as the range of artworks in Brighton's streets.</p><p>Each of the performances was recorded on site using portable motion capture systems. When phones or tablets are pointed at select urban artworks, a performance is rendered live in 3D, allowing the audience to walk around and explore the relationship between the performance and location.</p><p>The original mini-site for this project is located at: <a href="http://www.bhaptic.net/moments/">http://www.bhaptic.net/moments/</a></p><hr><figure class="kg-card kg-embed-card"><iframe src="https://player.vimeo.com/video/80370446?app_id=122963" width="480" height="270" frameborder="0" title="Moments in Place AR Performances, 2011-13" webkitallowfullscreen="" mozallowfullscreen="" allowfullscreen></iframe></figure><p>Double-tapping on the screen while the app is running brings up menus displaying &quot;what to look for&quot; and &quot;where to look&quot; in order to help find the appropriate artworks. Alternatively, users can select &quot;Just Dance&quot; to view one of the performances without tracking.</p>
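The app's flow — a recognised artwork triggers the performance recorded at that site, with "Just Dance" as an untracked fallback — can be sketched as a simple lookup. The target names and file names below are hypothetical placeholders, not the actual Brighton locations or data files:

```python
# Hypothetical mapping from recognised image targets to the motion-capture
# performance recorded at that site (all names are illustrative only).
PERFORMANCES = {
    "mural_north_laine": "performance_north_laine.mocap",
    "pavilion_gardens_statue": "performance_pavilion.mocap",
}

def select_performance(recognised_target=None, just_dance=False):
    """Return the performance file to render, or None while still searching."""
    if just_dance:  # 'Just Dance': play a performance without any tracking
        return next(iter(PERFORMANCES.values()))
    return PERFORMANCES.get(recognised_target)
```

The lookup returns None until the image tracker reports a known target, so the renderer shows nothing while the user is still hunting for the artwork.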
<p>The project invites viewers to think about the movement qualities of various locations. Some of the locations are quiet and inspire slow, reflective movement, while others are hectic, full of shoppers and tourists.</p>
<p>You should try to visit at least a selection of locations. However, if this is not possible, or if you are not able to visit Brighton, you can point your phone at any of the webpages in the &quot;what to look for&quot; section, and the phone will play the performance associated with the location.</p>
<p>Alternatively, you can go through the menu and select &quot;Just Dance&quot; to watch one of the performances wherever and whenever you wish.</p>
<figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/09/shape_pic-4.png" class="kg-image" alt="Moments in Place AR Performances, 2011-2013"></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/09/shape_pic3-6.png" class="kg-image" alt="Moments in Place AR Performances, 2011-2013"></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/09/shape_pic1-5.png" class="kg-image" alt="Moments in Place AR Performances, 2011-2013"></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/09/Screenshot_2013-08-29-11-24-05-4.png" class="kg-image" alt="Moments in Place AR Performances, 2011-2013"></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/09/IMG_4952-4.PNG" class="kg-image" alt="Moments in Place AR Performances, 2011-2013"></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/09/shape_pic23-6.png" class="kg-image" alt="Moments in Place AR Performances, 2011-2013"></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/09/shape_pic14-4.png" class="kg-image" alt="Moments in Place AR Performances, 2011-2013"></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/09/IMG_4990-4.PNG" class="kg-image" alt="Moments in Place AR Performances, 2011-2013"></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/09/shape_pic18-5-1.png" class="kg-image" alt="Moments in Place AR Performances, 2011-2013"></figure>]]></content:encoded></item><item><title><![CDATA[Will.0.W1sp, 2005-2008]]></title><description><![CDATA[Will.0.W1sp is an interactive installation exploring our ability to recognise human motion without human form. 
It uses particle systems to create characters or “whisps” with their own drifting, flowing movement, but which also follow digitised human movements.]]></description><link>http://b.bhaptic.net/will-0-w1sp/</link><guid isPermaLink="false">5bf5514566c1b50d5f8907fe</guid><category><![CDATA[Installation]]></category><category><![CDATA[Director]]></category><category><![CDATA[Research]]></category><category><![CDATA[Immersive]]></category><dc:creator><![CDATA[Kirk Woolford]]></dc:creator><pubDate>Wed, 12 Oct 2005 21:16:00 GMT</pubDate><media:content url="http://b.bhaptic.net/content/images/2018/09/W_Trio.jpg" medium="image"/><content:encoded><![CDATA[<figure class="kg-card kg-embed-card"><iframe src="https://player.vimeo.com/video/62573239?app_id=122963" width="436" height="320" frameborder="0" title="Will.0.w1sp Installation Documentation, (2005-8)" webkitallowfullscreen="" mozallowfullscreen="" allowfullscreen></iframe><figcaption>Will.0.w1sp Installation Documentation, (2005-8)</figcaption></figure><img src="http://b.bhaptic.net/content/images/2018/09/W_Trio.jpg" alt="Will.0.W1sp, 2005-2008"><p>Concept/direction: Kirk Woolford<br>Sound: Carlos Guedes<br>Movement: Ailed Izurieta, Patrizia Penev, Marjolein Vogels</p><p><strong>ABSTRACT</strong><br>Will.0.W1sp is an interactive installation exploring our ability to recognise human motion without human form. It uses particle systems to create characters or “whisps” with their own drifting, flowing movement, but which also follow digitised human movements. The central point of the environment is a 2x6m curved screen allowing the whisps to be projected at human scale while giving them enough space to move and avoid visitors through a combination of video tracking and motion sensors. If visitors move quickly in the space, the particle flow becomes erratic. If a visitor moves suddenly toward the whisps, they explode. The installation system uses custom particle targets and funnels instead of traditional emitters. 
It performs realtime motion analysis both on the prerecorded motion capture sequences and the movement of the audience to determine how to route the particles across the scene. The motion vectors are simultaneously fed to an audio system using audio grains to create sound flowing in synch with the imagery. Will.0.w1sp invites visitors to chase after virtual, intangible characters which continually scatter and reform just beyond their reach.</p><figure class="kg-card kg-embed-card"><iframe src="https://player.vimeo.com/video/62573238?app_id=122963" width="436" height="320" frameborder="0" title="Will.0.w1sp Installation Interaction, 2006" webkitallowfullscreen="" mozallowfullscreen="" allowfullscreen></iframe></figure><p><strong>INTRODUCTION</strong><br>Will-o’-the-Whisp, Irrlicht, Candelas, nearly every culture has a name for the mysterious blue white lights seen drifting through marshes and meadows. Whether they are lights of trooping faeries, wandering souls, or glowing swamp gas, they all exhibit the same behavior. They dance ahead of people, but when approached, they vanish and reappear just out of reach. Will.0.w1sp creates new dances of these mysterious lights, but just as with the originals, when a viewer ventures too close, the lights scatter, spin, and spiral, then reform and continue the dance just beyond the viewer’s reach.<br>Will.0.W1sp is based on real-time particle systems moving dots like fireflies smoothly around an environment. The particles have their own drifting, flowing movement, but also follow digitised human movements. They shift from one captured sequence to another – performing 30 seconds of one sequence, scattering, then reforming into 1 minute of another sequence by another dancer. In addition to generating the particle systems, the computer watches the positions of viewers around the installation. 
If an audience member comes too close to the screen, the movement either shifts to another part of the screen or scatters completely.</p><p><strong>OVERVIEW</strong><br>The human visual system is fine-tuned to recognise both movement and other human beings. However, the entire human perceptual process attempts to categorise sensations. Once sensations have been placed in their proper categories, most are ignored and only a select few pass into consciousness. This is why we can walk through a city and be only scarcely aware of the thousands of people we pass, but immediately recognise someone who looks or “walks like” a close friend.</p><p>Will.0 plays with human perception by giving visitors something that moves like a human being, but denies categorisation. It grabs and holds visitors’ attention at a very deep level. In order to trigger the parts of the visual system tuned to human movement, the movement driving the particles is captured from live dancers using motion capture techniques. While the installation is running, the system decides whether to smoothly flow from one motion sequence into another, make an abrupt change in movement, or to switch to pedestrian motions such as sitting, walking off screen, etc. These decisions are based on the position and movement of observers in the space.</p><p>The choreography is arranged into a series of movement sequences which flow through specific poses. It is a mix of shifting approaches and retreats as the system plays through its sequences and responds to the presence and movements of the audience.</p>
All this is generated and mixed live by software watching the flow and positions of the particles in the space.</p><p>The sound software is a Max/MSP patch with custom plug-ins written by Carlos Guedes. Because it uses so much CPU power to generate the sound, it is run on a separate computer. Particle motion data, overall tempo, x, y, and z offsets are all transmitted from the computer controlling the particles to the sound computer using Open Sound Control streams.</p><figure class="kg-card kg-embed-card"><iframe src="https://player.vimeo.com/video/62596712?app_id=122963" width="436" height="320" frameborder="0" title="Will.0.video, 2006" webkitallowfullscreen="" mozallowfullscreen="" allowfullscreen></iframe></figure><p><strong>MOTION DATA</strong><br>The installation uses a small database of motion sequences as the base animation data for the particle dancers. Initial test sequences were created with a Hypervision Reactor 3D motion capture system. However, the system used for these tests required a great deal of “data cleaning”, or manually correcting motion data. Sometimes it was necessary to adjust 32 points for every frame of capture data. We decided this was just as time-consuming as manually rotoscoping data, so we created an intelligent rotoscoping package called “Rotocope” shown in “Figure 1”. Rotocope allowed us to manually position markers over key frames in video-taped movement sequences.</p><p><strong>PRESENTATION</strong><br>The project is presented on a custom designed screen. The screen is slightly taller than a human and curves slightly from center to the edges. This keeps the images at a human scale while completely filling the audience’s field of vision. At the same time, it allows movement at the edges to shift and disperse more quickly than movement in the center. The screen is much wider than standard high-def format (2.2m x 6.0m) to create more space for the image to shift and flow. 
It is driven by two synched video projectors.</p><p><strong>INTERACTION</strong><br>A close reading of the algorithms outlined in sections 4 and 5 will reveal that the entire character of the Will.0.w1sp system revolves around two simple values: the maximum velocity and maximum acceleration of the particles. As mentioned in section 3, increasing the max velocity of the particles allows them to track their motion targets more closely, and the particles begin to take on the form of the original dancer. Increasing the acceleration narrows the width of this dancer. At the same time, decreasing the velocity while increasing the acceleration causes the particles to shoot off in nearly straight lines. In effect, it explodes the virtual dancer.</p><p>The Will.0.W1sp interaction system is a separate program which tracks the positions and actions of visitors, and decides how to modulate these two core variables. It also controls where on the 6m curved screen the dancer positions itself.</p><p>Inverse Interaction<br>When many visitors walk into an “interactive” environment, the first thing they do is to walk up to the screen and start waving their arms madly to see the piece “interact”. Will.0.W1sp responds much as most humans or animals would: it scatters and waits for the person to calm down and stay in one place for a minute. It uses inverse interaction. The installation uses interactive techniques not to drive itself, but to calm down the viewer. When the viewers realize the system is responding to them, they eventually extend to the installation the same respect they would extend to a live performer.</p><p>Tracking techniques<br>The first version of Will.0.w1sp used an overhead camera and image analysis software to track visitors. This worked as long as the lighting conditions could be controlled. However, it was unstable. Will.0.w1sp was then invited to represent Fundación Telecom at ARCO’06 in Madrid. 
A robust tracking system had to be developed which could handle up to 50 people in front of the screen at once, and a continual flow of more than 2,000 people a day.<br>We replaced the overhead camera with an array of passive infra-red motion detectors connected to an Arduino [3] open source microcontroller system. We developed a control system for this sensor array using the “Processing” [4] open source programming environment together with the Processing “oscp5” Open Sound Control library developed by Andreas Schlegel. The tracking system records motion in 9 different zones and calculates max velocity, acceleration, and character position based on the positions and overall motion of viewers. It uses its own timers to suddenly ramp up values and slowly return them to normal when it no longer sees any motion in any of its sensory zones.</p><p><strong>CONCLUSION</strong><br>The installation is very successful in its ability to generate dynamic images which walk the line between recognition as human and other. Visitors to the installation often feel they are in the space with another “live” entity which is not quite human. 
After initially playing around to see the particles scatter, visitors will often sit for an hour or more to watch them perform.</p><p><strong>ACKNOWLEDGMENTS</strong><br>Will.0.w1sp was funded by a grant from the Amsterdams Fonds voor de Kunst and supported by the Lancaster Institute for the Contemporary Arts, Lancaster University.</p><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/09/W_Trio-1.jpg" class="kg-image" alt="Will.0.W1sp, 2005-2008"></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/09/Picture-4.jpg" class="kg-image" alt="Will.0.W1sp, 2005-2008"></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/09/Picture-16.jpg" class="kg-image" alt="Will.0.W1sp, 2005-2008"></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/09/Picture-67.jpg" class="kg-image" alt="Will.0.W1sp, 2005-2008"></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/09/Picture95.jpg" class="kg-image" alt="Will.0.W1sp, 2005-2008"></figure><p></p><p><strong>REFERENCES</strong><br>[1] C. Guedes, “The m-objects: A small library for musical generation and musical tempo control from dance movement in real time.” Proceedings of the International Computer Music Conference, International Computer Music Association, 2005, pp. 794-797<br>[2] C. Guedes. “Extracting musically-relevant rhythmic information from dance movement by applying pitch-tracking techniques to the video signal.” Proceedings of the Sound and Music Computing Conference SMC06, Marseille, France, 2006, pp. 25-33<br>[3] <a href="http://www.arduino.cc/">http://www.arduino.cc/</a><br>[4] <a href="http://www.processing.org/">http://www.processing.org/</a></p>]]></content:encoded></item><item><title><![CDATA[Mobstar, 2004-2005]]></title><description><![CDATA[A real-time multi-player online game based on the world of the mafia. 
Players begin as lowly thugs doing drug runs, and climb the criminal career ladder by accumulating wealth and status to become Godfather of their own gang.]]></description><link>http://b.bhaptic.net/mobstar/</link><guid isPermaLink="false">5bf5514566c1b50d5f89080c</guid><category><![CDATA[Client]]></category><category><![CDATA[Games]]></category><category><![CDATA[Web]]></category><dc:creator><![CDATA[Kirk Woolford]]></dc:creator><pubDate>Fri, 01 Apr 2005 18:28:00 GMT</pubDate><media:content url="http://b.bhaptic.net/content/images/2018/11/Mobstar-3.jpg" medium="image"/><content:encoded><![CDATA[<img src="http://b.bhaptic.net/content/images/2018/11/Mobstar-3.jpg" alt="Mobstar, 2004-2005"><p>A real-time multi-player online game based on the world of the mafia. Players begin as lowly thugs doing drug runs, and climb the criminal career ladder by accumulating wealth and status to become Godfather of their own gang.</p><p>Client: Woedend</p><p>Role: <em>Redesigned interface and MySQL database schemas, added new security, puzzles, and mini-games</em></p><p>The game has been relaunched with new web and mobile interfaces at: <a href="https://www.mobstargame.com/">https://www.mobstargame.com/</a></p><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/11/Mobstar-1.jpg" class="kg-image" alt="Mobstar, 2004-2005"></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/11/Mobstar-2.jpg" class="kg-image" alt="Mobstar, 2004-2005"></figure>]]></content:encoded></item><item><title><![CDATA[Eccky Flirt, 2005]]></title><description><![CDATA[Eccky was created in August 2005 by Dutch developer Media Republic in association with MSN in the Netherlands.]]></description><link>http://b.bhaptic.net/eccky-flirt/</link><guid isPermaLink="false">5bf5514566c1b50d5f890810</guid><category><![CDATA[Games]]></category><category><![CDATA[Client]]></category><dc:creator><![CDATA[Kirk 
Woolford]]></dc:creator><pubDate>Thu, 17 Mar 2005 13:17:00 GMT</pubDate><media:content url="http://b.bhaptic.net/content/images/2018/11/Eccky_Title.png" medium="image"/><content:encoded><![CDATA[<img src="http://b.bhaptic.net/content/images/2018/11/Eccky_Title.png" alt="Eccky Flirt, 2005"><p>Eccky was created in August 2005 by Dutch developer Media Republic in association with MSN in the Netherlands. Eccky has characteristics of life simulation and virtual pet games. The gameplay of the first version of Eccky involved a virtual baby, or Eccky, which was born on the basis of information derived from both Eccky players. Eccky used an AIML chatbot and MSN Messenger for chat between users and the Eccky baby. In 2006, Eccky became an independent subsidiary of Media Republic.</p><p>Client: Woedend for Media Republic/MSN</p><p>Role: Sole programmer developing a mini-game for the Eccky network.</p><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/11/Eccky2.png" class="kg-image" alt="Eccky Flirt, 2005"></figure><p>Translation: Eccky is in love!<br>
Except that Eccky doesn't dare to tell the girl or boy that s/he has a crush on them.<br>
You need to pass notes to Eccky's new friend telling him/her that Eccky likes them. But be careful: the other kids in the class are trying to pester Eccky's heart-throb with their own teasing notes.</p>
<p>The game mechanic is note-throwing. The goal is to throw notes that land on Eccky's friend's desk, while also knocking away the notes from the other students.</p>
<p>The game is written in Flash, with live connections to the Microsoft Network (MSN) servers for Eccky's gender, appearance, and experience.</p>
<figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/11/Eccky-Instructions.png" class="kg-image" alt="Eccky Flirt, 2005"></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/11/Eccky-Anger.png" class="kg-image" alt="Eccky Flirt, 2005"></figure>]]></content:encoded></item><item><title><![CDATA[My Horse and Me, 2003-2004]]></title><description><![CDATA[Delivering the equestrian life, from sports and leisure riding to horse care, My Horse and Me is a unique gaming experience for anyone with an interest or passion for horses.]]></description><link>http://b.bhaptic.net/my-horse/</link><guid isPermaLink="false">5bf5514566c1b50d5f89080b</guid><category><![CDATA[Games]]></category><category><![CDATA[Client]]></category><dc:creator><![CDATA[Kirk Woolford]]></dc:creator><pubDate>Thu, 25 Nov 2004 18:40:00 GMT</pubDate><media:content url="http://b.bhaptic.net/content/images/2018/11/MyHorse.jpg" medium="image"/><content:encoded><![CDATA[<img src="http://b.bhaptic.net/content/images/2018/11/MyHorse.jpg" alt="My Horse and Me, 2003-2004"><p>Delivering the equestrian life, from sports and leisure riding to horse care, <em><em>My Horse and Me</em></em> is a unique gaming experience for anyone with an interest or passion for horses. The game features the most accurate horse models and animations yet realized in a video game, alongside a rewarding game play experience, all set against a backdrop of indoor and outdoor environments. <em><em>My Horse and Me</em></em> has a variety of game play modes and options to give players an authentic experience. The Championship mode lets the player take part in competitions at indoor and outdoor locations around the globe ranging from rustic stables and classical riding schools to world-class tournament locations. A series of mini-games offer a variety of game play experiences alongside rewarding horse care game play and extensive customization. 
This title also offers both first- and third-person camera modes, putting riders right in the saddle and creating a perfect training tool for practicing disciplines that riders face in real life.</p><figure class="kg-card kg-embed-card"><iframe width="480" height="270" src="https://www.youtube.com/embed/mXa9flrh23E?feature=oembed" frameborder="0" allow="autoplay; encrypted-media" allowfullscreen></iframe></figure><p>I conducted the market research into &quot;Girl Games&quot; and co-directed the original design team to create one of the first games released on the Nintendo Wii.</p>
<p><strong>Woedend Team:</strong></p>
<p>Game Designers: Jochem van der Spek, Marc Schmidt</p>
<p>Project Leaders: Kirk Woolford, Bas van Berkestijn</p>
<p>Art Direction: Paul Coops, Jochem van der Spek</p>
<p>3-D Artists: Jurriaan Hos, Mischa Rootsaert, Mattijs van der Valk, Bart Janssen, Paul Coops, Ben Vergeer</p>
<p>Character Design: Hans Pieko</p>
<p>Dynamica: Jochem van der Spek</p>
<p>Technical Research: Kirk Woolford, Jochem van der Spek</p>
<p>Sound: Soundware Amsterdam</p>
<p>Traffic: Rianne Zandstra</p>
<figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/11/51IX-QXrJeL.jpg" class="kg-image" alt="My Horse and Me, 2003-2004"></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/11/51tktQeIM9L.jpg" class="kg-image" alt="My Horse and Me, 2003-2004"></figure><p></p><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/11/510dDqC9DsL.jpg" class="kg-image" alt="My Horse and Me, 2003-2004"></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/11/517UY-RPXVL.jpg" class="kg-image" alt="My Horse and Me, 2003-2004"></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/11/CAM_04_HORSES_300-1.png" class="kg-image" alt="My Horse and Me, 2003-2004"></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/11/PLEIN-1.jpg" class="kg-image" alt="My Horse and Me, 2003-2004"></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/11/moodstal4.jpg" class="kg-image" alt="My Horse and Me, 2003-2004"></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/11/buitenmood2.jpg" class="kg-image" alt="My Horse and Me, 2003-2004"></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/11/miss_classic-1.jpg" class="kg-image" alt="My Horse and Me, 2003-2004"></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/11/p49.png" class="kg-image" alt="My Horse and Me, 2003-2004"></figure><figure class="kg-card kg-image-card"><img src="http://b.bhaptic.net/content/images/2018/11/MyHorse_game2-6.png" class="kg-image" alt="My Horse and Me, 2003-2004"></figure>]]></content:encoded></item></channel></rss>