Last words

The whole process of packing, removals, and last goodbyes is odd.

It feels a little like being able to attend your own wake. You’re still able to speak to and hug people, but there is a finality to it all that hangs in the balance and clouds the conversations.

Some last words are words of advice, from me to those I love. I’m off to meet my future, to explore my potential with my family on new shores. I want to know they will do the same in my absence. I want them to know that life is short and there to be lived.

I want them to take any sadness and let it drive them forwards. Most of all, I want them to know they are loved.

Partings and goodbyes

So, the goodbyes start, and they come in waves. It’s impossible to see everyone at once, nor would you want to.

Even those family members you may have had issues with, or found hard work, are still a part of you: a part of your identity, childhood and life, even if that part is small or diminished now in adulthood. It’s bittersweet, and when all words and communication fail, all you really want to say, one last time, is “I love you”.

There’s a realisation with older folk that this goodbye could be the last.

This is always the way though, every time you part with someone. You cannot let that realisation rule your destiny but rather let it inform how you treat people when they’re there.

The future is wireless

“Courage”…

In the latest episode of our podcast, Waiting for Review, Dave Nott and I briefly discussed wireless headphones. For both of us, it seems the future is wireless, and we kind of ‘get’ the direction that Apple and others have been leading things in by eliminating hardware stereo jacks.

I recently sold my Edirol V4 video mixer on eBay. It was analogue, SD resolution, and I hadn’t used it for many years. When I first started “VJing”, in 2004, it was the standard for any VJ to use. I had a twinge of sadness in parting with it, but objectively my app GoVJ does everything I used it for with multiple DVD and laptop sources over a decade ago. I’d coded a software version of a hardware product, and it runs on a device that fits in my pocket. It’s fun living in the future!

Stream all the things

All this set me to thinking about wireless video.

I love AirPlay, and GoVJ supports AirPlay output of the video mix that the user is performing. I’m looking into supporting Chromecast down the line as well, possibly even at the same time as AirPlay, to provide dual outputs over Wi-Fi from the application.
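
As a rough illustration of the mechanics (not necessarily how GoVJ does it internally), driving an AirPlay output boils down to treating the receiver as a second UIScreen and putting a window on it. A minimal sketch, using the Swift 3 era APIs:

```swift
import UIKit

final class ExternalDisplayManager {
    private var externalWindow: UIWindow?
    private var observer: NSObjectProtocol?

    func startObserving() {
        // Fires when an AirPlay receiver (or a wired display) becomes available.
        observer = NotificationCenter.default.addObserver(forName: .UIScreenDidConnect,
                                                          object: nil,
                                                          queue: .main) { [weak self] note in
            guard let screen = note.object as? UIScreen else { return }
            self?.showMix(on: screen)
        }
    }

    private func showMix(on screen: UIScreen) {
        // A dedicated window on the external screen carries the full mix,
        // while the device's own screen keeps the performance controls.
        let window = UIWindow(frame: screen.bounds)
        window.screen = screen
        window.rootViewController = UIViewController() // would host the mix output view
        window.isHidden = false
        externalWindow = window
    }
}
```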

For real-time video applications on the desktop, there are two technologies that allow inter-app transmission of video data: on macOS this is Syphon, and on Windows there is a counterpart called Spout. These utilise texture-sharing functionality that relies heavily on support from the OS and graphics card drivers. On macOS I understand this leverages the IOSurface object.

This allows different apps to ‘transmit’ their video, with extremely low latency, between each other. For example, I can create an audio visualiser that creates pretty particle effects in response to a microphone input, and pipe that video straight through to another piece of software that controls multiple screen outputs and video mapping. This interoperability is extremely powerful. It provides a whole other level of expression and choice on the desktop platform for video artists. It has also created a niche ecosystem of apps from separate developers that can all be combined with each other.
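
To give a flavour of how lightweight this is for the app doing the sending, here’s a rough sketch of publishing an OpenGL texture through Syphon on macOS. The method names are from the Syphon framework as I understand them, and the GL context and texture are assumed to exist elsewhere in the render loop:

```swift
import AppKit
import OpenGL
// Assumes the Syphon framework has been added to the project.

// A rough sketch: wrap a SyphonServer and publish each rendered frame.
final class SyphonPublisher {
    private let server: SyphonServer?

    init(name: String, context: CGLContextObj) {
        // Other Syphon-aware apps discover this server by name.
        server = SyphonServer(name: name, context: context, options: nil)
    }

    func publish(textureID: GLuint, width: Int, height: Int) {
        let size = NSSize(width: width, height: height)
        // Only a texture reference crosses the process boundary,
        // which is why the latency stays so low.
        server?.publishFrameTexture(textureID,
                                    textureTarget: GLenum(GL_TEXTURE_RECTANGLE_EXT),
                                    imageRegion: NSRect(origin: .zero, size: size),
                                    textureDimensions: size,
                                    flipped: false)
    }
}
```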

What about mobile?

I’m keen for there to be something similar on iOS. I believe it could open up the iPad as a tool for live video artists in a similar fashion. Unfortunately, due to sandboxing and other restrictions, recreating Syphon is impossible: IOSurface on iOS is a private API, disallowed for non-Apple applications.

I’m currently looking at NewTek’s NDI SDK. This allows for encoding video data and transmitting it over Wi-Fi.

If iOS apps could support this, presenting available outputs over the network via Bonjour for example, then something similar to Syphon could be created. This would be subject to network latency when going between devices; on-device, I believe it would be limited to the speed possible through the network stack running locally on the device itself. This could mean an iPad running two apps in split-screen could send video data from one to the other. I could have a ‘master’ video mixing application, and swap between a variety of companion video synth/visualiser apps alongside it, each providing their input to the mix.
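
As a sketch of the discovery half of that idea, an app could advertise its output as a Bonjour service, and a companion app could browse for it. The service type and port below are placeholders I’ve made up for illustration, not anything NDI actually defines:

```swift
import Foundation

// Hypothetical service type, for illustration only.
let videoOutputServiceType = "_govj-video._tcp."

// Advertise this app's output so other apps (or other devices) can find it.
// In real code the service and browser must be retained and given delegates.
let service = NetService(domain: "local.",
                         type: videoOutputServiceType,
                         name: "GoVJ Output",
                         port: 5959)
service.publish()

// A companion app browses for the same type, then opens a connection
// over which the encoded frames would be streamed.
let browser = NetServiceBrowser()
browser.searchForServices(ofType: videoOutputServiceType, inDomain: "local.")
```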

There would be problems, I’m sure. Encoding and decoding like this will thrash the hardware, and it may not be possible to do this yet on existing iPads.

It also wouldn’t achieve the low latency that desktop can achieve with texture sharing, but would it be “good enough”?

Ultimately the NDI SDK is closed source, and I’m unsure whether relying on it for something like this would be the best choice. On the other hand, some desktop VJ software may support NDI, and this could be a route towards a wider ecosystem for video artists across different hardware.

I plan on exploring this further as time allows over this coming year.

Getting to know my customers

JFDI…

Since I started out with GoVJ, there has been one thing I’ve been meaning to do but, for one reason or another, have put off: surveying, and obtaining some qualitative information on my potential market and customers.

There are a few reasons I haven’t done this: time, unsureness about what questions to ask, fear of putting my head above the parapet and talking to people, believing I already knew my target market, being a user myself… the list could go on, but it all boils down to not really wanting to experience a little potential discomfort.

So, to kick off 2017 I decided it was time to get into it.

GoVJ Survey January 2017

Survey Monkey is free for up to 10 questions. It seems to be the go-to for anyone who wants something done quickly. I know it can make research professionals wince for various reasons, but for me it was ideal: it’s quick and easy, and it lowered my bar to just ****ing doing it.

I wrote out my 10 questions, and then got a good friend of mine, Liam, to sense-check and validate what I had written.

A little help is an awesome thing

If you want to get anyone to click through to something online, a picture is pretty much mandatory.

Liam is a kick-ass motion graphics artist, and also a dab hand at design. He knocked together a call-to-action graphic for me that took my app’s icon design and turned it into a megaphone. Whilst I’m capable of making my own graphics, I am often in developer mode rather than designer mode. This helped me just get the survey out without procrastinating. I’d strongly advise developers without the skills or time to outsource this sort of work.

GoVJ Survey Call To Action

Using this graphic and a simple call for participants, I put the survey out there. So that I could give something back, I combined it with a reminder that people can get some free video content if they sign up to my mailing list.

GoVJ Facebook post and call to action

This was sent to two groups on Facebook and to Reddit, and I also sent an email out to my mailing list (~150 people) that I had built previously when launching the app.

So what happened?

I was really heartened by the community response: I received 70 responses within three days. Whilst a niche like this is unlikely to ever yield enough responses to reach a high level of statistical certainty for any insights or conclusions, it is enough to give me some good indications.

There are some early insights I can see from scanning Survey Monkey’s graphs and summarisations:

  • A future macOS product serving the same niche might be a valuable use of my time.
  • There is a narrative in how people are using iOS alongside Macs, and in their use of PCs too.

I believe I shall need to keep an eye on the market, and having a presence on Windows (UWP) might be worth considering in future. Prior to this, I had considered Android my next best step for diversifying and expanding what I serve.

I plan on exploring this data further later this week, and that may be the subject of another blog post.

Friday links – 02/09/2016

This week’s Friday links contains comic book industry rants, views on management, the future of apps, and trips to Mars…

Die Industry, Die! 

Jude Terror writes about pre-orders and the state of the direct market in comic book publishing today. I remember when things changed in the 90s, but I had no idea how this change affected my local comic book shop.

After reading this rant, I wonder if there is a space for a kind of “bandcamp.com for comic books” for independent comic book publishing. I imagine a world where digital publishing is the default, and print-runs of trades or individual issues are closer to the vinyl-releases that bands do for their fans. In this way, maybe it could become possible for a creator to support their work on < 2000 “true fans”, purchasing their digital work for a monthly fee that’s much less than current physical comic book costs.

An open letter to managers of women

Jason Shen writes a call to all managers to check themselves in their approach to appraising their staff. As a former manager, this rings a few bells. Despite agreeing wholeheartedly with the message, this has made me wonder whether I could have done things any better in the past.

6 scientists “return to Earth” after a year in isolation

The crew of an experiment to simulate human interactions and living on Mars talk about the completion of their mission.

“A person can be totally cool one minute and severely annoying the next,” he said in an email. “The little things people do that you’d never notice in real life can make you think about tripping them on the stairs here.”

How to survive the future of apps

Kate Abrosimova writes about App Fatigue in users, chat bots and AI.

Server-side Swift vs. everything else…

Qutheory explore the speed of Swift on the server vs. Go, Python and several other languages. I’m interested to see how this looks in the future, especially with respect to Swift’s string handling speeds.

Friday links – 26/08/2016

Rounding up the top few links I’ve read this week that have really piqued my interest.

A range of tech, development and science for your reading pleasure:

I Never Want To Be Near A Nosulus Rift Again

Niels Broekhuijsen writes about his experience with a VR attachment that emits the aromatic flavours of flatulence. It sounds nauseating, but I do wonder how the technology might evolve to cover sweet smells.

Architect Your iOS App for Easy Backend Replacement – Part II: Currying and Partial Application

This article talks about ways of developing your app so that swapping backends doesn’t result in a complete rewrite. I like the approach. Even when developing smaller applications (as I am at the moment) I think it’s important to think modular.

NASA just made all the scientific research it funds available for free

Loads of research, publicly accessible, no pay-wall. This feels like the way space research should be, as far as possible, available for all of humanity.

Networks all the way down

This blog post from 2014 discusses how so much of modern technology is actually general purpose computing and networks configured to behave as if they were discrete objects.

As someone in his early 30s, I wonder how younger generations perceive this side of things. My generation definitely had a lot of discrete hardware, but it feels like we were probably the last. Perhaps this explains some of the fetishisation of hardware for things like audio synthesis in recent years.

Facebook’s video editor is embarrassingly old

TechCrunch writes about how out of date Facebook’s video editor is, with a particular focus on how Apple could out-innovate them in this field.

I agree, I feel like Apple’s offering with iMovie is also fairly out of date now. I’ve mooted developing my own video editor recently, and the potential for being ‘sherlocked’ in this space feels really very strong. In any case, it would be nice to see fresher video editing options from the big guys all around.

 

That’s all for this week – have a great weekend!

Friday Links

Friday’s here again. Here are some links, old and new, that have interested me this week.

These centre around coding, technology, development and gender:

This fantastic article interviews Genevieve Bell, an Australian anthropologist who works at Intel. She has some compelling points of view on AI, its potential development, and how that development should include and involve much wider aspects of humanity. After reading this, a wide level of inclusion across all types of people in that development feels vital to me.

This article, published earlier this year, relays some research that looks into GitHub activity and gender. Researchers found that code written by women was approved at a higher rate than code written by men, but only if their gender was not known.

  • Sex, shoplifting and scares

    Becca Caddy writes for wareable on her experiences wearing a Mio Alpha 2 heart rate monitor. It makes for some interesting reading, particularly how long her heart rate stayed high after the shoplifting experiment.

  • Making a case for letter case

    John Saito writes an interesting article about how capitalisation can really affect the tone, look and feel of an application. For me, it has reinforced that thinking these things through is important, especially for applying a consistent ‘house style’ across a whole app and its website.

  • Hanselminutes Podcast, interviewing Stephanie Hurlburt of Binomial

    This podcast has been the best listen I’ve had all week out of all the podcasts I follow. Stephanie and Binomial are doing very interesting work in GPU texture compression. I can see myriad ways that a better compressor and better translation formats could help support the next wave of VR and AR applications.

Additionally I’m excited about the potential for cross-pollination with video formats and encoding themselves. This could lead to lower latency video for live applications, such as VJing. Obviously, having an app like GoVJ means I have quite an active interest in that.

I loved hearing Stephanie’s approach to development overall; just get stuck in, don’t be afraid of complicated things. I’d recommend this for a listen even if you’re not a GPU/real time graphics enthusiast!

Planning… A week on

My last post was about planning. I did broadly as I said in that post, and planned out development and marketing activity for my latest app.

Since doing so I have encountered a series of blockages against my planned development time. Nothing ever goes to plan, right?

The Blocks:

  • Working with beta versions of iOS. I’ve encountered some bugs and oddities. I’ve had to file my first bug report. This has been quite challenging, and in hindsight I should have expected more of this than I did.
  • My experience. Some of the things I’m doing within this app are new to me, so I’ve had to do some learning along the way. I did account for this with buffers of time in my plan, but it’s still felt tough at times.
  • Bringing my library in. I have a framework for my video mixing engine. This works fine when used in other projects for iPhone apps, but not when dropped into an iOS 10 message extension.
  • I’d planned for development but not administrative tasks.

So what have I done about it?

I really want to ensure I get this shipped as soon as possible, so I’ve tried to take a pragmatic view of the blocks.

I’ve chosen work-arounds, and made notes for revisiting those post-release. Work-arounds are not always possible though, and a couple of issues have had to just be ground through. The guiding principle is always based on ensuring release.

I’ve spoken with other developers about some of my issues, drawing from the online community and those I know locally. Sometimes it’s helpful just to bounce things off of someone else, although I’d rather not just treat people like rubber ducks.

Sometimes I switch what I’m working on to another task within the project that can be done instead. It can be good to just change ‘modes’.

If all else fails, I go for a run. It can be easy when working on problems to just keep going and going. After a certain point this rarely results in fixing the issue itself. Scheduling a run in my day, and enforcing cut-off points for transitioning from work->family life are quite essential.

The most successful strategies are those where I take a step back, however much I don’t want to at the time.

Planning as an indie

Erik Person blogged here about planning:

As an indie for two months now, I realize I’m not taking my opportunities to plan like I should. This is a reminder to myself to spend a little extra time planning before tackling a new feature. I don’t need to write down the plan or show it to anyone, but the act of planning will be a significant boost over what I’ve been doing lately.

I am two months into my own indie journey also, and I can relate to this very much.

I plan. I have plans for where I am going and what I am doing… but a lot of this remains in my head. It stays there until I eventually end up knee-deep in too much work. At that point I usually remember to take a step back and go into planning mode.

When I was juggling a full-time job I had to have a proper plan written down. Stepping through it bit by bit was part of how I managed to get my own things shipped in evenings/weekends.

Erik’s post is a timely reminder for me. I have a project I want to get shipped as soon as possible. Whilst I know I’m making good progress, sketching out the key stepping stones and blocks between now and launch is something I really need to sit down and do.

I will probably fall back on the method I’ve used for GoVJ and HoloVid.

This consists of:

  1. With a pad and pen:
    • Write down all the key features and functionality for V1.0.
    • Write down how I want to market and launch it.
    • Write down plans for key beta testing milestones.
  2. Type these lists into a spreadsheet and put each list into order of key milestones on the path to release. If something can’t be done before something else, then that dictates which comes first.
  3. Estimate time in days or hours for each item on the spreadsheet, along with which week each item is being completed in.
  4. Look for concurrency within the marketing list and the development list activities.

I think concurrency can be important to getting things done solo. What I mean by this is that marketing activities should not wait until after the app is available in the App Store. They should be planned at the start of the project, and begun ahead of the release date.

Some marketing activities can be done in small segments at times where I may not be at my best for coding. For example: I drafted copy for the App Store and my mailing lists whilst on lunch-breaks in my old job and saved them to Evernote. Whilst also saving me time, this has given me a nice browsable copy of everything I did for launch (and beyond) for these things.

As the lists are worked through, I keep a separate spreadsheet tab for logging bugs and crossing them off. Closer to release this list needs to be as close to complete as possible. It’s important to keep in mind whether a bug is really show-stopping or not. If it affects the user, then yes it is. If it only affects my desire for the app to be perfect then it may wait for the next release.

So why haven’t I done this yet?

I’ve been procrastinating on doing this for this project. I suspect my main reason has been that I haven’t had to do it. Being obviously time-poor in my old life pretty much dictated some level of organisation just to get anything done.

So now I know what I’ll be up to this weekend. Cheers for the reminder Erik!

Further round and further in…

Over the last month I have been working on a rebuild of the roboEngine. This is the code at the heart of GoVJ and HoloVid.

When I built HoloVid earlier this year, I brought the video mixing code from GoVJ into a static library and kept it mainly as it was. The static library was entirely Objective-C based, with a lot of OpenGL boilerplate. My plan was to bring the library and all roboheadz products over to Swift by mid-2017; I didn’t fancy reprogramming all the boilerplate code.

My newest project is Swift based. Where possible I’m trying not to write new Objective-C code in my projects, to force myself to learn Swift.

I started bringing Swift code into the library alongside the Objective-C. This was a bad move – Swift cannot be used in a static library! Had I thought it through, I would have realised: this is related to Swift not yet having a stable ABI, so of course making static libraries with it would be a bad idea. With Swift you should make a dynamic framework instead.

So I was faced with a choice: rebuild the engine sooner than I’d planned, or start adding more Objective C to the base. In the end I decided going all in on Swift was going to fit me better.


Fast-forward a few weeks, and I finally have the entire library working in Swift. This means I can mix multiple layers of video or images with each other in real time, with custom blending modes. I can also add a filter to each layer (essentially a custom shader).

I feel a lot more comfortable now with Swift’s general syntax and with things such as delegation and extensions. One of my favourite aspects of Swift is how it implements custom setters and getters on variables; this feels very neat. Thomas Hanning’s post on custom properties expands on this well.
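
For anyone who hasn’t met them yet, here’s a small, generic example of the kind of computed properties and property observers I mean. The names are purely illustrative, not taken from roboEngine:

```swift
struct VideoLayer {
    // A stored property with observers: react whenever opacity changes.
    var opacity: Float = 1.0 {
        willSet {
            print("Opacity will change from \(opacity) to \(newValue)")
        }
        didSet {
            // Clamp to a sensible range after every write
            // (re-assigning here does not retrigger the observers).
            opacity = min(max(opacity, 0.0), 1.0)
        }
    }

    // A computed property with a custom getter and setter,
    // exposing the same value on a 0–100 scale.
    var opacityPercent: Float {
        get { return opacity * 100.0 }
        set { opacity = newValue / 100.0 }
    }
}

var layer = VideoLayer()
layer.opacityPercent = 50.0   // sets opacity to 0.5 via the custom setter
```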

The process of refactoring has also meant that the engine itself is better laid out, and more efficient. Swift’s handling of Core Foundation objects and their allocation/deallocation seems to be working fine. My overall memory usage appears to have come right down.

I’m now beyond the porting of old code and have started adding new features. First off the list is an exporting routine that allows me to export compositions to the camera roll. This will enable an export routine for HoloVid, and provide the backbone for my new app.
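
The camera-roll end of that is the simpler half. As a minimal sketch, assuming the engine has already rendered a finished composition out to a movie file on disk, saving it with the Photos framework looks roughly like this (permission handling omitted):

```swift
import Photos

/// Saves an already-rendered composition (a movie file on disk) to the camera roll.
/// Assumes photo library access has already been granted.
func saveCompositionToCameraRoll(at fileURL: URL,
                                 completion: @escaping (Bool, Error?) -> Void) {
    PHPhotoLibrary.shared().performChanges({
        // Creates a new video asset in the user's photo library from the file.
        PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: fileURL)
    }, completionHandler: completion)
}
```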

Two videos blended

This may not look like much, but I’m very happy with the results. Here are two videos composited into one, using a luma key to drop the darkest colours (the black background) from the top layer.
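
For the curious, a luma key in a fragment shader is only a few lines. Here’s a minimal sketch of the idea, with the GLSL source held in a Swift string; the uniform names and threshold are illustrative, not roboEngine’s actual shader:

```swift
// Minimal luma-key fragment shader (OpenGL ES 2.0 GLSL), illustrative only.
let lumaKeyFragmentShader = """
precision mediump float;

varying highp vec2 textureCoordinate;
uniform sampler2D inputTexture;   // the top layer
uniform float lumaThreshold;      // e.g. 0.1 to drop near-black pixels

void main() {
    vec4 colour = texture2D(inputTexture, textureCoordinate);
    // Rec. 601 weights give perceived brightness (luma).
    float luma = dot(colour.rgb, vec3(0.299, 0.587, 0.114));
    // Pixels darker than the threshold become transparent,
    // so the layer below shows through when the layers are blended.
    float alpha = step(lumaThreshold, luma);
    gl_FragColor = vec4(colour.rgb, colour.a * alpha);
}
"""
```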

I’m fully aware I will have to update a load of this code with Swift 3 and future releases. Given how clean my code-base now feels to me though, it’s effort I’m happy to make as and when it becomes necessary.