AVAssetImageGeneratorCompletionHandler and Swift

Posting this here for my own probable future reference.

I’m picking up Swift in small bursts, usually when I have a small screen that doesn’t rely on any of the Objective-C based libraries I have within GoVJ.

I had a UITableViewController that I wanted to load thumbnails into from a collection of videos held as an array of AVAssets.

// Configure the cell
let assetUrl = demoContent.objectAtIndex(indexPath.row) as! NSURL
let asset = AVAsset(URL: assetUrl)
let imageGenerator = AVAssetImageGenerator(asset: asset)
imageGenerator.maximumSize = CGSize(width: 640, height: 480)
imageGenerator.apertureMode = AVAssetImageGeneratorApertureModeProductionAperture
imageGenerator.appliesPreferredTrackTransform = true
imageGenerator.requestedTimeToleranceAfter = kCMTimeZero
imageGenerator.requestedTimeToleranceBefore = kCMTimeZero

So far so good; everything is basically the same as in Objective-C.

I then create a CMTime value that refers to the middle of the video file, wrapped in an NSValue:

// Create thumbNail at middle of loop

let tVal = NSValue(CMTime: CMTimeMultiplyByFloat64(asset.duration, 0.5))

I then call generateCGImagesAsynchronouslyForTimes. I want to be able to chuck my thumbnail-setting code into its completion handler. This is fairly straightforward in Objective-C, and I understand what I’m doing there when creating the block.

However, Swift’s closure syntax eluded me, in particular the way ‘in’ follows the parameters and is then followed by the code you want to execute.
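
To get it straight in my head: the parameter list comes first, then ‘in’, then the statements you want to run. A trivial made-up example, nothing to do with AVFoundation (the names here are just placeholders):

// General shape of a Swift closure: { (parameters) in statements }
let doubler: (Int) -> Int = { (value: Int) in
    return value * 2
}
print(doubler(21)) // prints 42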

This is what I have now to set my thumbnails:

// Handler receives: requested time, image, actual time, result, error
imageGenerator.generateCGImagesAsynchronouslyForTimes([tVal], completionHandler: { (_, im: CGImage?, _, _, e: NSError?) in
    if let img = im {
        dispatch_async(dispatch_get_main_queue()) {
            cell.demoContentThumbNail.image = UIImage(CGImage: img)
        }
    } else {
        print("failed in generating thumbnail")
    }
})

return cell
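
For completeness (mainly for future me), this is roughly how it all hangs together inside cellForRowAtIndexPath. It’s only a sketch: the cell class name and reuse identifier are placeholders I’ve made up here, and demoContent is assumed to be an array of NSURLs for the bundled videos; the generator code itself is the same as above.

override func tableView(tableView: UITableView, cellForRowAtIndexPath indexPath: NSIndexPath) -> UITableViewCell {
    // "DemoCell" is a placeholder name for the custom cell class that owns demoContentThumbNail
    let cell = tableView.dequeueReusableCellWithIdentifier("DemoCell", forIndexPath: indexPath) as! DemoCell

    // demoContent is assumed here to be an NSArray of NSURLs
    let assetUrl = demoContent.objectAtIndex(indexPath.row) as! NSURL
    let asset = AVAsset(URL: assetUrl)

    let imageGenerator = AVAssetImageGenerator(asset: asset)
    imageGenerator.maximumSize = CGSize(width: 640, height: 480)
    imageGenerator.appliesPreferredTrackTransform = true

    // Ask for a frame from the middle of the clip
    let tVal = NSValue(CMTime: CMTimeMultiplyByFloat64(asset.duration, 0.5))

    imageGenerator.generateCGImagesAsynchronouslyForTimes([tVal], completionHandler: { (_, im: CGImage?, _, _, _) in
        if let img = im {
            // UIKit work goes back on the main queue
            dispatch_async(dispatch_get_main_queue()) {
                cell.demoContentThumbNail.image = UIImage(CGImage: img)
            }
        }
    })

    return cell
}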

Through the eyes of a child

This weekend we upgraded my wife’s iPad. This meant that my older iPad 3 and her original iPad mini 1 were available for our kids.

They had been using 2012 Nexus tablets. Arguably these were from a similar era to the iPads, but they lag so much when doing certain tasks now. The iPads do too at times, but nowhere near as much. Perhaps it’s a statement on how well iOS 9 can run on older devices, but that’s beside the point of this post.

What was really awesome was setting up iCloud accounts for both children, putting them on Family Sharing and being able to let them just explore their new devices and play.

My eldest was really pleased to be able to iMessage. Both kids spent a good while sending silly photos and picture messages to each other and to my wife and me.

I started typing messages back from my Mac. When I told my eldest that was how I was chatting, he was amazed and had to check it out. We had a chat about how messages are routed through the net, and how it all syncs up on whatever device I’m on.

Log of chat

We played with FaceTime too, which he thinks is awesome.

The thing that really struck me was seeing all this anew from a child’s point of view. They discovered the UI and the capabilities of the device easily, all through play.

For them, the power of these devices is very much in the sharing and communication available straight away.

There’s a lot to be said for that.

Hello world!

Hello and welcome.

After much deliberation, I decided I would finally start keeping a blog. It will chronicle my thoughts on development, tech, business and anything else I choose to write about.

Thank you for reading,

Dave