i3Factory

Your iPhone, iPad & Android Application Factory

Browsing Posts published by Igor Wolfango Schiaroli

 


i3Factory publishes the medical journals of CIC Edizioni Internazionali.

A new multi-issue, multi-category customization of the i3editorial system for publishing magazines on iPad, iPhone and Android.

https://itunes.apple.com/us/app/cic-edizioni-internazionali/id808099480?l=it&ls=1&mt=8

This App is free to download and offers quick access to “CIC Edizioni Internazionali” journals.
You can freely browse our publications catalogue, select and download any issue and finally read it with our integrated reader.

 
 


According to rumors reported by Bloomberg, one of the world’s leading providers of financial market information, Apple is developing an iPhone with a curved screen and new pressure sensors.
Bloomberg also points to a patent filed by Apple Inc., suggesting that the company is developing new iPhone models that will most likely include larger screens with curved glass and advanced sensors able to detect different levels of touch pressure.

Two models scheduled for release in the second half of next year would feature larger displays with glass that curves down at the edges. Sensors capable of distinguishing heavy or light touches on the screen will be incorporated in subsequent models, again according to the Bloomberg rumors (source: http://www.bloomberg.com/news/2013-11-10/apple-said-developing-curved-iphone-screens-enhanced-sensors.html).

With these two new models, featuring curved screens of 4.7 and 5.5 inches, Apple is expected to approach the size of the 5.7-inch Galaxy Note 3 that Samsung Electronics Co. launched last September. Last month the South Korean manufacturer launched a Galaxy with a curved screen, the latest in a line of phones in various sizes and price ranges that helps Samsung stay ahead of Apple in global mobile phone market share.

The new Apple smartphones are still in development and production plans have not been finalized, according to Bloomberg’s source, who added that the company will probably release them in the third quarter of 2014.

Apple broke with its glorious past in September, when it presented two versions of the iPhone: the iPhone 5s with more advanced features and the colorful, somewhat cheaper iPhone 5c, as part of a strategy to compete in larger markets.

The first iPhone, released in 2007, offered very functional and innovative touchscreen technology. Apple continues its history of experimentation and development of new materials, in collaboration with the best suppliers of new technologies able to improve the functions of its devices.


Bloomberg also reports that Apple will open a new plant in Arizona to make components for its devices.

In conclusion

The fact that Apple could announce two new and different devices is no longer so surprising, if we remember what it did with the iPhone 5c and 5s.
We therefore expect other sources to confirm the introduction of pressure sensors, or rather of a screen capable of measuring the force applied to it, making the writing and drawing experience even more innovative, satisfying and, above all, creative.

 
 


Wayap is an Italian app for planning billboard advertising in Italy.
With this app you can search for WAYAP advertising spaces across the whole Italian territory. You can see the billboards near your current position or close to a specified address, and freely adjust the radius that determines how far away the signs are searched for.
Thanks to a bookmark system you can keep track of the signs you are interested in and request a quote from WAYAP Srl at any time.
The app also allows you to add personal notes and photos to each sign, in order to better plan your advertising space.
 
 
WAYAP is the first Italian company to have designed and engineered outdoor advertising in step with the third millennium.

ANDROID LINK:
https://play.google.com/store/apps/details?id=com.i3factory.apiitalia

 

 
 


i3Factory upgraded ToucHotel for Android.

Here is the Google Play download link for ToucHotel: https://play.google.com/store/apps/details?id=com.touchotel

APP DESCRIPTION:

Want advice from a friend on which hotel to book? With ToucHotel you can read your friends’ reviews, share your favorite hotels within the community and choose from over 260,000 hotels and B&Bs!

★ Download ToucHotel Now, It’s 100% FREE!

Stay in touch with hotel-booking, hotel booking at your fingertips and TOUCH & SLEEP!

 
 

In an interesting article, Matthew Panzarino asks why the look of the iOS 7 design is so different from the previous one (source: http://cdn.theapplelounge.com/wp-content/uploads/2013/06/iOS6vsiOS7_icons.png).

After seeing Apple’s iOS 7 presentation, we now know that the icons on the home screen (the so-called springboard) will change slightly.
Many of the new icons were designed mainly by members of Apple’s marketing and communications department, no longer by the app development teams. Jony Ive (now head of Human Interface at Apple) guided the design team step by step in setting the look and the color palette of the icons of the so-called stock apps.

 

 

iOS7 Springboard

 

How has the appearance of the app icons really changed compared to iOS 6?
Let’s look at the comparison image created by @pawsupforu:

Note that some icons have been taken directly from OS X (Safari), while others have been completely redesigned (Calendar).

ICON DIMENSIONS: from 114px to 120px

iOS 7 Guide Freebie PSD

One of the major changes (besides the “flat” design) is the change in icon size.
In iOS 7 application icons are now 120px (compared to the previous 114px), and the corner radius is now 27px (compared to the previous 20px). With this change comes the need to update the icon sizes of our apps.
Fortunately, the designer Seevi Kargwal has created the aforementioned iOS 7 icon guide in PSD format to help facilitate the redesign process. You can check out more of his work.

Following the link to this web page http://dribbble.com/shots/1111211-IOS-7-Guide-Freebie-PSD you will find 2 attachments:
IOS_7_Guide_freebie_PSD.psd 900 KB
IOS-7-Guide-Full-Size.jpg 300 KB

iOS 7 Icon Rounded Corner Radius

The website “Cult of Mac” argues that Jony Ive has given the iOS 7 icons the same rounded corners as the iMac.

Home screen icons – iOS 7 (Source)

In its article, Cult of Mac argues that the new Director of Human Interface, Jony Ive, has redesigned the iOS operating system as a multi-layered, parallax experience.
Ive has thus migrated his hardware design philosophies onto iOS: the Messages app icon shows how the corners of the icon have the same tapered edges found on Apple’s iMac products.
The difference is only a small number of pixels that most users will probably never notice, so Brad Ellis, who first spotted it, created a comparison GIF so you can actually see the changes:

The iOS 7 icons have the same corner curvature as the iMac (Brad Ellis).

In his blog, Joel Page details the icon’s corner curve, as you can see in the image below:


The new iOS 7 icon is a 120×120 px square.
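To get a feel for these numbers, here is a minimal Objective-C sketch (my own illustration, not part of the original guide) that clips a 120×120 px image with a plain 27px rounded rectangle. Keep in mind this is only an approximation for preview purposes: the real iOS 7 mask is closer to a superellipse and is applied automatically by the system, so the icon you ship must stay square.

#import <UIKit/UIKit.h>

// Illustrative preview only: approximate the iOS 7 icon shape (120x120 px,
// ~27px corner radius) by clipping a square icon with a rounded-rect path.
// The springboard/App Store apply the real mask for you.
UIImage *RoundedIconPreview(UIImage *squareIcon) {
    CGRect rect = CGRectMake(0.0, 0.0, 120.0, 120.0);
    UIGraphicsBeginImageContextWithOptions(rect.size, NO, 1.0);
    UIBezierPath *path = [UIBezierPath bezierPathWithRoundedRect:rect cornerRadius:27.0];
    [path addClip];
    [squareIcon drawInRect:rect];
    UIImage *preview = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return preview;
}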

We conclude by noting that the iOS 7 design changes bring important innovations in icon design.

The icons on the iPhone home screen received a slight increase in size, from 57px and 114px to 60px and 120px respectively.

 

A new golden-ratio grid and a much brighter color scheme have been introduced, and you will find them included in the App Icon Template PSD file by following the link to this page: http://appicontemplate.com/ios7.
App Icon Template, a free Photoshop template, is a resource that makes it easier to design icons. By changing a single object it automatically generates all the different formats required on iOS and Android.

 
 
accastampato, a magazine published with i3editorial.com

accastampato is the first popular-science magazine for students designed and built by students. Published and developed with i3editorial.com, it allows high-quality management at no cost.
The project is in fact organized by the physics students behind “accatagliato”, in order to describe in simple terms the frontier research in which they are personally involved.

Download from the iTunes App Store Newsstand: https://itunes.apple.com/us/app/accastampato/id630566984?mt=8

 
 
Rivista della provincia di Milano (Magazine of the Province of Milan)

 


Milan Province Magazine: La Provincia in Casa
has chosen to enter the mobile and tablet market with i3Factory, using the i3F Editorial publishing system, a technologically advanced solution with no management costs. The advanced technology of i3F Editorial allows publishers to publish new issues of a journal or magazine at no cost. It seems unbelievable, but it is true.

The version for Apple iOS devices, all iPhone and iPod Touch models and iPad (iPad 1, iPad 2, iPad 3 Retina, iPad 4 Retina), can be found at the following link:

https://itunes.apple.com/it/app/la-provincia-in-casa/id606239216?mt=8

The Android version, for all Android smartphones and tablets (Samsung, Kindle, Sony, Asus, etc.):

https://play.google.com/store/apps/details?id=org.imaginor (GOOGLE PLAY- Android Market)

http://www.amazon.com/Provincia-Casa-Milano-italian-Magazine/dp/B00BUL9THS (Amazon App Store – Kindle fire)

 
 

i3Factory World accompanies the Ministry of Labour onto the App Store through the i3Editorial publishing system.
It is an easy way to keep up to date on the world of work. The Cliclavoro newsletter is a monthly feature that collects the most important industry news, labor market trends, opportunities in Europe, and interviews with notable personalities.

The newsletter is divided into 5 sections:

Opening
In depth
The interview
From Europe
From the social networks

Keep up to date on labor market trends, with data and information enriched with links and multimedia content.
To follow news from the world of work in real time, download the CliComunica application.
Apple iPhone and iPad Version: https://itunes.apple.com/it/app/clicomunica/id582587332?mt=8

Android version: https://play.google.com/store/search?q=clicomunica&c=apps

 
 

We often have to update our applications with the high-resolution images needed for the new iPad (iPad 3 or iPad 4). Fortunately, the new iPad mini has kept the same resolution as the first iPad, 1024 × 768 pixels.
Since it is not always easy to find the official Apple documents, in this article I have gathered all the information we need to update the icons, the launch images, and so on.

First of all let’s start with this handy table:

Device/Screen File Name (PNG) Icon Size (pixels)
iPhone and iPod
Application Icon for iPhone (retina display) Icon@2x.png 114 x 114
Application Icon for iPhone Icon.png 57 x 57
Settings/Spotlight icon for iPhone (retina display) Icon-Small@2x.png 58 x 58
Settings/Spotlight icon for iPhone Icon-Small.png 29 x 29
Launch image Portrait (retina display) Default@2x.png 640 x 960
Launch image Portrait Default.png 320 x 480
iPhone 5
Launch image for iPhone 5 Portrait (retina display) Default-568h@2x.png 640 x 1136
iPad
Application Icon for the new iPad (retina display) Icon-72@2x.png 144 x 144
Application Icon for the iPad Icon-72.png 72 x 72
Settings/Spotlight icon for iPad Icon-Small-50@2x.png 100 x 100
Settings/Spotlight icon for iPad Icon-Small-50.png 50 x 50
Launch image Portrait (retina display) Default-Portrait@2x.png 1536 x 2008
Launch image Portrait Default-Portrait.png 768 x 1004
Launch image Landscape (retina display) Default-Landscape@2x.png 2048 x 1496
Launch image Landscape Default-Landscape.png 1024 x 748
iTunes App Store
App icon for the App Store (retina display) iTunesArtwork@2x.png 1024 x 1024
App icon for the App Store iTunesArtwork.png 512 x 512

Remember that the transition from iOS 5 to iOS 6 coincided with the arrival of the iPhone 5, along with the 5th-generation iPod touch.
These new Apple devices brought only one big change in terms of app development: the screen resolution. They have a large 4″ WDVGA (Wide Double VGA) screen, 640 × 1136 pixels, with a 326 DPI Retina display. They have the same width as the iPhone 4/4S but are 176 pixels taller in portrait mode.
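As a side note, a common way to adapt other assets to the taller screen at runtime is to check the screen bounds. Below is a minimal sketch under my own assumptions: the 568-point check and the "-568h" naming convention are purely illustrative (only the Default-568h@2x.png launch image name has special meaning to iOS).

#import <UIKit/UIKit.h>

// Rough runtime check for the 4-inch (iPhone 5 class) screen.
// In portrait the screen is 320 x 568 points (640 x 1136 pixels on Retina).
static BOOL IsFourInchScreen(void) {
    CGSize size = [UIScreen mainScreen].bounds.size;
    return MAX(size.width, size.height) >= 568.0;
}

// Example: pick a taller background image when needed.
// The "-568h" suffix here is just an app-level naming convention.
static NSString *BackgroundImageName(void) {
    return IsFourInchScreen() ? @"Background-568h" : @"Background";
}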

App Icon Template

Let me point out again, as I have done in another article, this useful tool that can be downloaded from the site “appicontemplate.com”.

By downloading the file you will get a PSD of the app icon that, through Photoshop Smart Objects, allows you to automate the process of exporting the various sizes of the icon .png files that must be included in the bundle of every iOS app.

With this Photoshop template we only need to edit the largest icon, and it will automatically render the smaller sizes through a fast workflow.
The template was created and is maintained by the Danish designer Michael Flarup.

How do you use the App Icon Template?
The template works with Photoshop CS2 or later. Just open the PSD file with your version of Photoshop, then right-click on the layer called “SMART EDIT THIS OBJECT” and click on ‘Edit Contents’.
This will open the Icon.psb file, where you can create your artwork on the canvas. After saving Icon.psb, it should be automatically rendered at the various sizes in the main PSD file. It is also possible to use the Photoshop Actions bundled with the resource to export square and rounded-corner versions of the icon files.

Good Design!

 
 

Why video composition

You may think that video composition should be limited to applications like iMovie or Vimeo, and so consider this subject, at least from the developer’s point of view, to be limited to a niche of video experts. Instead it can be extended to a broader range of applications, not essentially limited to practical video editing. In this post I will provide an overview of the AV Foundation framework applied to a practical example.

In my particular case the challenge was to build an application that, starting from a set of existing video clips, was able to build a story by attaching a subset of these clips based on decisions taken by the user while interacting with the app. The final play is a set of scenes, shot in different locations, that compose a story. Each scene consists of a prologue, a conclusion (epilogue) and a set of smaller clips that will be played by the app based on the user’s choices. If the choices are correct, the user will be able to play the whole scene up to its happy ending, but in case of mistakes the user will return to the initial prologue scene or to some intermediate scene. The diagram below shows a possible scheme of a typical scene: one prologue, a winning stream (green), a few branches (yellow are intermediate, red are losing branches) and a happy ending. So somewhere in TRACK1 the user will be challenged to take a decision; if he/she is right then the game will continue with TRACK2, if not it will enter the yellow TRACK4, and so on.

iPhone & iPad: Movie Game Storyboard
What I have in my hands is the full set of tracks, each track representing a specific subsection of a scene, and a storyboard which gives me the rules to be followed in order to build the final story. So the storyboard is made of the scenes, of the tracks that compose each scene and of the rules that establish the flow through these tracks. The main challenge for the developer is to put together these clips and play a specific video based on the current state of the storyboard, then advance to the next, select a new clip again and so on: everything should be smooth and interruptions limited. Besides, the user needs to take his decisions by interacting with the app, and this can be done by overlapping the movie with some custom controls.

The AV Foundation Framework

Trying to reach the objectives explained in the previous paragraph using the standard Media Player framework view controllers, MPMoviePlayerController and MPMoviePlayerViewController, would be impossible. These controllers are good for playing a movie and providing the system controls, with full-screen and device rotation support, but absolutely not for advanced control. Since the release of the iPhone 3GS the camera utility has had some trimming and export capabilities, but these capabilities were not given to developers through public functions of the SDK. With the introduction of iOS 4, the work done by Apple on the iMovie app has given developers a rich set of classes that allow full video manipulation. All these classes have been collected and exported in a single public framework, called AV Foundation. This framework has existed since iOS 2.2, when it was dedicated to audio management with the well-known AVAudioPlayer class; it was then extended in iOS 3 with the AVAudioRecorder and AVAudioSession classes, but the full set of features that allow advanced video capabilities arrived only with iOS 4 and was fully presented at WWDC 2010.

The position of AV Foundation in the iOS framework stack is just below UIKit, behind the application layer, and immediately above the basic Core Services frameworks, in particular Core Media, which is used by AV Foundation to import the basic timing structures and functions needed for media management. In any case you can note the different position in the stack in comparison with the very high-level Media Player framework. This means that this framework does not offer a plug-and-play class for simple video playing, but you will appreciate the high-level and modern concepts behind it; we are certainly not at the level of older frameworks such as Core Audio.

(image source: Apple iOS Developer Library)

Building blocks

The class organization of AV Foundation is quite intuitive. The starting point and main building block is AVAsset. AVAsset represents a static media object and is essentially an aggregate of tracks, which are timed representations of a part of the media. All tracks are of uniform type, so we can have audio tracks, video tracks, subtitle tracks, and a complex asset can be made of more tracks of the same type, e.g. we can have multiple audio tracks. In most cases an asset is made of an audio and a video track. Note that AVAsset is an abstract class, so it is unrelated to the physical representation of the media it represents; besides, creating an AVAsset instance doesn’t mean that we have the whole media ready to be played: it is a purely abstract object.


There are two concrete asset classes available: AVURLAsset, to represent a media in a local file or in the network, and AVComposition (together with its mutable variant AVMutableComposition) for an asset composed by multiple media. To create an asset from a file we need to provide its file URL:

NSDictionary *optionsDictionary = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES] forKey:AVURLAssetPreferPreciseDurationAndTimingKey];
AVURLAsset *myAsset = [AVURLAsset URLAssetWithURL:assetURL options:optionsDictionary];

The options dictionary can be nil, but for our purposes – that is, making a movie composition – we need to calculate the duration exactly and provide random access to the media. This extra option, that is setting the AVURLAssetPreferPreciseDurationAndTimingKey key to YES, could require extra time during asset initialization, and this depends on the movie format. If the movie is in QuickTime or MPEG-4 format then the file contains additional summary information that avoids this extra parsing time; but there are other formats, like MP3, where this information can be extracted only after decoding the media file, in which case the initialization time is not negligible. This is a first recommendation for developers: please use the right file format depending on the application.
In our application we already know the characteristics of the movies we are using, but in a different kind of application, where you must do some editing on user-imported movies, you may be interested in inspecting the asset properties. In such a case we must remember the basic rule that initializing an asset doesn’t mean we have loaded and decoded the whole asset in memory: every property of the media file can be inspected, but this could require some extra time. For completeness we simply introduce the way asset inspection can be done, referring the interested reader to the reference documentation (see the suggested readings list at the end of this post). Basically each asset property can be inspected using an asynchronous protocol called AVAsynchronousKeyValueLoading, which defines two methods:

- (AVKeyValueStatus)statusOfValueForKey:(NSString *)key error:(NSError **)outError
- (void)loadValuesAsynchronouslyForKeys:(NSArray *)keys completionHandler:(void (^)(void))handler

The first method is synchronous and immediately returns the knowledge status of the specified value. E.g. you can ask for the status of “duration” and the method will return one of these possible statuses: loaded, loading, failed, unknown, cancelled. In the first case the key value is known and the value can be immediately retrieved. In case the value is unknown it is appropriate to call the loadValuesAsynchronouslyForKeys:completionHandler: method, which at the end of the operation will call the callback given in the completionHandler block, which in turn will query the status again and take the appropriate action.
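For instance, a minimal sketch of how the asynchronous loading of the “duration” key might look (variable names are mine; the calls are the standard AVAsynchronousKeyValueLoading ones):

// Load the "duration" key asynchronously, then check its status in the
// completion handler before reading the value.
NSArray *keys = [NSArray arrayWithObject:@"duration"];
[myAsset loadValuesAsynchronouslyForKeys:keys completionHandler:^{
    NSError *error = nil;
    AVKeyValueStatus status = [myAsset statusOfValueForKey:@"duration" error:&error];
    if (status == AVKeyValueStatusLoaded) {
        // safe to read the duration now
        NSLog(@"Asset duration: %f s", CMTimeGetSeconds([myAsset duration]));
    } else {
        // loading, failed, cancelled or unknown: handle accordingly
        NSLog(@"Duration not available yet: %@", error);
    }
}];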

Video composition

As I said at the beginning, my storyboard is made of a set of scenes and each scene is composed of several clips whose playing order is not known a priori. Each scene behaves separately from the others, so we’ll create a composition for each scene. When we take a set of assets, or tracks, and from them build a composition, all in all we are creating another asset. This is the reason why the AVComposition and AVMutableComposition classes are in fact subclasses of the base AVAsset class.
You can add media content to a mutable composition by simply selecting a segment of an asset and adding it at a specific point of the new composition:

- (BOOL)insertTimeRange:(CMTimeRange)timeRange ofAsset:(AVAsset *)asset atTime:(CMTime)startTime error:(NSError **)outError

In our example we have a set of tracks and we want to add them one after the other in order to generate a continuous sequence of clips. So the code can be written simply in this way:

 

// build the composition by appending each asset at the current head position
AVMutableComposition *composition = [AVMutableComposition composition];
CMTime current = kCMTimeZero;
NSError *compositionError = nil;
for (AVAsset *asset in listOfMovies) {
    BOOL result = [composition insertTimeRange:CMTimeRangeMake(kCMTimeZero, [asset duration])
                                       ofAsset:asset
                                        atTime:current
                                         error:&compositionError];
    if (!result) {
        if (compositionError) {
            // manage the composition error case
        }
    } else {
        // advance the head by the duration of the asset just inserted
        current = CMTimeAdd(current, [asset duration]);
    }
}

First of all we have introduced the concept of time. Note that all media have a concept of time different from the usual one. First of all, time can move back and forth; besides, the time rate can be higher or lower than 1x if you are playing the movie in slow motion or fast forward. It is also considered more convenient to represent time not as a floating point or integer number but as a rational number. For this reason the Core Media framework provides the CMTime structure and a set of functions and macros that simplify the manipulation of these structures. So in order to build a specific time instance we write:

CMTime myTime = CMTimeMake(value,timescale);

which in fact specifies a number of seconds given by value/timescale. The main reason for this choice is that movies are made of frames, and frames are paced at a fixed rate per second. So for example if we have a clip which has been shot at 25 fps, then it is convenient to represent the single frame interval as a CMTime variable with value=1 and timescale=25, corresponding to 1/25th of a second. 1 second will be given by a CMTime with value=25 and timescale=25, and so on (of course you can still work with plain seconds if you like; simply use the CMTimeMakeWithSeconds(seconds, timescale) function). So in the code above we initially set the current time to 0 seconds (kCMTimeZero), then start iterating over all of our movies, which are the assets in listOfMovies. We add each of these assets at the current position of our composition using their full range ([asset duration]). For every asset we move our composition head (current) forward by the duration (as a CMTime) of the asset. At this point our composition is made of the full set of tracks added in sequence. We can now play them.
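As a quick illustration of CMTime arithmetic (a small sketch of mine, using only standard Core Media functions; note the two-argument CMTimeMakeWithSeconds signature):

#import <Foundation/Foundation.h>
#import <CoreMedia/CoreMedia.h>

static void CMTimeExample(void) {
    // One frame of a 25 fps movie: 1/25th of a second.
    CMTime oneFrame = CMTimeMake(1, 25);
    // One second expressed at the same timescale (25/25).
    CMTime oneSecond = CMTimeMake(25, 25);
    // The same second built from a floating point value and a preferred timescale.
    CMTime alsoOneSecond = CMTimeMakeWithSeconds(1.0, 25);
    // Times can be added together, e.g. to advance the composition head.
    CMTime total = CMTimeAdd(oneSecond, oneFrame);
    NSLog(@"%f and %f seconds", CMTimeGetSeconds(alsoOneSecond), CMTimeGetSeconds(total)); // 1.00 and 1.04
}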

Playing an asset

The AV Foundation framework doesn’t offer any built-in full player like the one we are used to seeing with MPMoviePlayerViewController. The engine that manages the playing state of an asset is provided by the AVPlayer class. This class takes care of all aspects related to playing an asset and essentially it is the only class in AV Foundation that interacts with the application view controllers to keep the application logic in sync with the playing status: this is relevant for the kind of application we are considering in this example, as the playback state may change during movie execution based on specific user interactions at specific moments inside the movie. However, we don’t have a direct relation between AVAsset and AVPlayer, as their connection is mediated by another class called AVPlayerItem. The sole purpose of this class organization is to separate the asset, considered as a static entity, from the player, which is purely dynamic, by providing an intermediate object that represents a specific presentation state of an asset. This means that to a given and unique asset we can associate multiple player items, all representing different states of the same asset and played by different players. So the flow in such a case is: from a given asset create a player item and then assign it to the final player.

AVPlayerItem *compositionPlayerItem = [AVPlayerItem playerItemWithAsset:composition];
AVPlayer *compositionPlayer = [AVPlayer playerWithPlayerItem:compositionPlayerItem];

 

In order to render on screen we have to provide a view capable of displaying the current playing status. We already said that iOS doesn’t offer an off-the-shelf view for this purpose, but what it does offer is a special Core Animation layer called AVPlayerLayer. You can then insert this layer in your player view’s layer hierarchy or, as in the example below, use it as the base layer of that view. So the suggested approach in this case is to create a custom MovieViewer and set AVPlayerLayer as its base layer class:

// MovieViewer.h

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
@interface MovieViewer : UIView {
}
@property (nonatomic, retain) AVPlayer *player;
@end

// MovieViewer.m

@implementation MovieViewer
+ (Class)layerClass {
return [AVPlayerLayer class];
}
- (AVPlayer*)player {
return [(AVPlayerLayer *)[self layer] player];
}
- (void)setPlayer:(AVPlayer *)player {
[(AVPlayerLayer *)[self layer] setPlayer:player];
}
@end

// Instantiating MovieViewer in the scene view controller
// We suppose “viewer” has been loaded from a nib file
// MovieViewer *viewer
[viewer setPlayer:compositionPlayer];

At this point we can play the movie, which is quite simple:

[[viewer player] play];
Observing playback status

It is relevant for our application to monitor the status of the playback and to observe some particular timed events occurring during the playback.
As far as status monitoring is concerned, you follow the standard KVO-based approach by observing changes in the status property of the player:

// inside the SceneViewController.m class we’ll register to player status changes
[viewer.player addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:NULL];

// and then we implement the observation callback
-(void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if(object==viewer.player) {
        AVPlayer *player = (AVPlayer *)object;
        if(player.status==AVPlayerStatusFailed) {
      // manage failure
        } else if(player.status==AVPlayerStatusReadyToPlay) {
      // player ready: manage success state (e.g. by playing the movie)
        } else if(player.status==AVPlayerStatusUnknown) {
      // the player is still not ready: manage this waiting status
        }
    }
}

Differently from the KVO-observable properties, timed-event observation is not based on KVO: the reason is that the player head moves continuously and playback is usually performed on a dedicated thread. So the system prefers to send its notifications through a dedicated channel, which in this case consists of a block-based callback that we can register to track such events. We have two ways to observe timed events:

  • registering for periodic intervals notifications
  • registering when particular times are traversed

In both methods the user can specify a serial queue the callbacks will be dispatched to (it defaults to the main queue) and of course the callback block. It is relevant to note the serial behaviour of the queue: this means that all events will be queued and executed one by one; for frequent events you must ensure that these blocks are executed fast enough to allow the queue to process the next blocks, and this is especially true if you’re executing the block on the main thread, to avoid making the application unresponsive. Don’t forget to schedule the block to run on the main thread if you update the UI.
Registration to periodic intervals is done in this way, where we ask for a 1 second callback whose main purpose will be to refresh the UI (typically updating a progress bar and the current playback time):

// somewhere inside SceneController.m
id periodicObserver = [viewer.player addPeriodicTimeObserverForInterval:CMTimeMakeWithSeconds(1.0, 1) queue:NULL usingBlock:^(CMTime time) {
    [viewer updateUI];
}];
[periodicObserver retain];

// and in the clean up method
-(void)cleanUp {
[viewer.player removeTimeObserver:periodicObserver];
[periodicObserver release];
}

// inside MovieViewer.m
-(void)updateUI {
    // do other stuff here
    // …
    // we calculate the playback progress ratio by dividing the current playhead position by the total movie duration
    float progress = CMTimeGetSeconds(self.player.currentTime) / CMTimeGetSeconds(self.player.currentItem.duration);
    // then we update the movie viewer progress bar
    [progressBar setProgress:progress];
}

 

Registration to timed events is done using a similar method which takes as argument a list of NSValue representations of CMTime (AVFoundation provides a NSValue category that adds CMTime support to NSValue):

// somewhere inside SceneController.m
id boundaryObserver = [viewer.player addBoundaryTimeObserverForTimes:timedEvents queue:NULL usingBlock:^{
[viewer processTimedEvent];
}];
[boundaryObserver retain];
// inside MovieViewer.m
-(void)processTimedEvent {
// do something in the UI
}
In both cases we need to unregister and deallocate the two opaque observer objects somewhere in our scene controller; we may suppose the existence of a cleanup method that will be assigned this task:
-(void)cleanUp {
[viewer.player removeTimeObserver:periodicObserver];
[periodicObserver release];
[viewer.player removeTimeObserver:boundaryObserver];
[boundaryObserver release];
}

While this code shows the general way to handle an event, in our application it is more appropriate to assign a specific action to each event, that is, we need to customize each handling block. Looking at the picture below, you can see that at specific time points inside each of our clips we assigned a specific event.


The figure is quite complex and not all relationships have been highlighted. Essentially what you can see is the “winning” sequence made of all green blocks: they have been placed consecutively in order to avoid the playhead jumping to different segments when the player takes the right decisions, so playback will continue smoothly without interruption. With the exception of the prologue track, which is just a prologue to the story and requires no user interaction at this stage, and its corresponding conclusion, simply an epilogue where the user is invited to go to the next scene, all other tracks have been marked with a few timed events, identified by the dashed red vertical lines. Essentially we have identified 4 kinds of events:

  • segment (clip) starting point: this will be used as a destination point for the playhead in case of jump;
  • show controls: all user controls will be displayed on screen, user interaction is expected;
  • hide controls: all user controls are hidden, and no more user interaction is allowed;
  • decision point, usually coincident with the hide controls event: the controller must decide which movie segment must be played based on the user decision.

Note that this approach is quite flexible and in theory you can define any kind of event; this depends on the imagination of the game designers. From the point of view of the code, we in fact subclassed AVURLAsset by adding an array of timed event definitions. At composition creation time, these events are re-timed according to the new time base (e.g. if an event fires at 0:35 of a clip, but the starting point of the clip is at 1:45 of the entire sequence, then the event must be re-timed to 1:45 + 0:35 = 2:20). At this point, with the full list of events, we can rewrite our boundary registration:

// events is the array of all re-timed events in the complete composition
__block __typeof__(self) _self = self; // avoids retain cycle on self when used inside the block
[events enumerateObjectsUsingBlock:^(id obj, NSUInteger idx, BOOL *stop) {
    TimedEvent *ev = (TimedEvent *)obj;
    [viewer.player addBoundaryTimeObserverForTimes:[NSArray arrayWithObject:[NSValue valueWithCMTime:ev.time]]
                                              queue:dispatch_get_main_queue()
                                         usingBlock:^{
        // send event to interactiveView
        [viewer performTimedEvent:ev];
        [_self performTimedEvent:ev];
    }];
}];
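For reference, here is a minimal sketch of what the TimedEvent model and the AVURLAsset subclass mentioned above might look like. The class names, property names and event types below are assumptions inferred from the code in this post, not the author's actual implementation:

#import <AVFoundation/AVFoundation.h>

// A single timed event attached to a clip (illustrative model).
typedef enum {
    TimedEventSegmentStart,
    TimedEventShowControls,
    TimedEventHideControls,
    TimedEventDecisionPoint
} TimedEventType;

@interface TimedEvent : NSObject
@property (nonatomic, assign) TimedEventType type;
@property (nonatomic, assign) CMTime time; // clip-relative, or re-timed to the composition
@end

@implementation TimedEvent
@synthesize type, time;
@end

// AVURLAsset subclass carrying the events defined for its clip.
@interface StoryAsset : AVURLAsset
@property (nonatomic, retain) NSArray *timedEvents; // array of TimedEvent
- (NSArray *)eventsRetimedFrom:(CMTime)clipStartTime;
@end

@implementation StoryAsset
@synthesize timedEvents;

// Shift every clip-relative event by the clip's start time in the composition
// (the 1:45 + 0:35 = 2:20 example above).
- (NSArray *)eventsRetimedFrom:(CMTime)clipStartTime {
    NSMutableArray *retimed = [NSMutableArray array];
    for (TimedEvent *ev in self.timedEvents) {
        TimedEvent *shifted = [[[TimedEvent alloc] init] autorelease];
        shifted.type = ev.type;
        shifted.time = CMTimeAdd(clipStartTime, ev.time);
        [retimed addObject:shifted];
    }
    return retimed;
}

- (void)dealloc {
    [timedEvents release];
    [super dealloc];
}
@end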

 

 

As you can see the code is quite simple: for each timed event we register a single boundary which simply calls two methods, one for the movie viewer and one for the scene controller; in both cases we send the specific event so the receiver will know exactly what to do. The viewer will normally take care of UI interaction (it will overlay a few controls on top of the player layer, so according to the events these controls will be shown or hidden; besides the viewer knows which control has been selected by the user) while the scene controller will manage the game logic, especially in the case of the decision events. When the controller finds a decision event, it must move the playhead to the right position in the composition:

 

CMTime goToTime = # determines the starting time of the next segment #
[viewer hide];
[viewer.player seekToTime:goToTime
          toleranceBefore:kCMTimeZero
           toleranceAfter:kCMTimePositiveInfinity
        completionHandler:^(BOOL finished) {
    if (finished) {
        dispatch_async(dispatch_get_main_queue(), ^{
            [viewer show];
        });
    }
}];

 

What happens in the code above is that when we need to move the playhead to a specific time, we first determine this time and then ask the AVPlayer instance to seek to it, trying to move the head to that position or after it, with some tolerance (kCMTimePositiveInfinity), but not before it (kCMTimeZero in the toleranceBefore: parameter; we need this because the composition is made of consecutive clips, and moving the playhead before the starting time of our clip could show a small portion of the previous clip). Note that this operation is not immediate and, even if quite fast, it can take about one second. What happens during this transition is that the player layer shows a still frame somewhere in the destination time region, then starts decoding the full clip and resumes playback from another frame, usually different from the still one. The final effect is not really good, so after some experimentation I decided to hide the player layer immediately before starting the seek and show it again as soon as the player informs me (through the completionHandler callback block) that the movie is ready to be played again.

Conclusions and references

I hope this long post will push other developers to start working on interactive movie apps that leverage the advanced video capabilities of iOS for purposes beyond plain video editing. The AV Foundation framework offers us very powerful tools which are not difficult to use. In this post I didn’t explore some more advanced classes, such as AVVideoComposition and AVSynchronizedLayer. The former is used to create transitions, the latter to synchronize Core Animation effects with the internal media timing.

Great references on the subject can be found in the iOS Developer Library or WWDC videos and sample code:

  • For a general overview: AVFoundation Programming Guide in the iOS Developer Library
  • For the framework classes documentation: AVFoundation Framework Reference in the iOS Developer Library
  • Video: Session 405 – Discovering AV Foundation from WWDC 2010, available in iTunesU to registered developers
  • Video: Session 407 – Editing Media with AV Foundation from WWDC 2010, available in iTunesU to registered developers
  • Video: Session 405 – Exploring AV Foundation from WWDC 2010, available in iTunesU to registered developers
  • Video: Session 415 – Working with Media in AV Foundation from WWDC 2011, available in iTunesU to registered developers
  • Sample code: AVPlayDemo from WWDC 2010 sample code repository
  • Sample code: AVEditDemo from WWDC 2010 sample code repository

 

Written by Carlo Vigiani