i3Factory World

Your iPhone, iPad & Android Application Factory


Many customers, after launching their own smartphone application, ask us how to encourage people to download it.
The process of developing an app can be tricky, but once it is completed you should be ready for another phase of work and devote energy and resources to acquiring new users.
However, making sure that users will find your app is not simple. In the early years of the App Store a large number of downloads could be obtained simply by proposing an application with a catchy name or special features; now, with hundreds of thousands of applications on the market, it can be really complicated to get a high number of downloads for your app and, above all, to make it known to all the users who would be willing to use it. In this article I have collected the first seven strategies that can easily be put in place to reach new users and convince them to download and use your new mobile application.

 

1. Optimize your description on the App Store
One of the main drivers of new users on the Apple App Store and/or the Google Play Store, depending on which platform you have chosen for your application, is simply what you put in the “App Description” field. The description of the app should be well thought out: we have to produce a text that is detailed and seductive at the same time, so that it can convince users that it is worth taking a look and trying your app. It is also important to have well-designed icons and high-resolution screenshots, so that they look good on all the different smartphone models for which the app is designed. If you are planning to launch the app in other countries, also have all the descriptions professionally translated, so that users do not run away because of a copy/paste badly translated with Google Translate. This simple rule should always be taken into account, because many users do take the time to read the description and view the screenshots. The usual rule therefore applies: a good first impression can prevent a user from switching to another app before deciding to download yours.

2. Create a Promotional Website and Marketing Content
Developing a traditional website is a great way to market your app; for some titles it turns out that the web is often the primary driver of new users. A good website contains a well thought out blog section to inform users about new products and the features of the application, a questions-and-answers section, and a section that shows how best to use the app. Do not forget to create a clearly visible download link from the site, using the app store badge (see figure below): these badges encourage users to download the app and play with it.

 

Badge for downloading from the App Store

App Store badge

The goal is always to encourage the user to install the application, so your content marketing should be focused on leading users to press the Download button. Note: you must always make sure that your site is formatted to work correctly on smartphone screens, i.e. that it is a “responsive” site. Many entrepreneurs make the mistake of having websites that read well on laptops and desktops but are a disaster on a mobile device. So if you are thinking of marketing a mobile app, you cannot help but have a “mobile-ready” website!

 

3. Promote Your Mobile App Through Your Existing Marketing Channels
It is always very useful to send an e-mail announcing your new application, and it is also helpful to tell potential users how it can be installed. So if you have a database of customer or user e-mail addresses, do not keep it in the drawer. What about messaging via SMS? Traditional mail? Signage? Regardless of where you are in the world, many of your current and future customers have a smartphone and will be willing to test your application to see if it is something they want to use regularly. Do not be afraid to let them know that your business is “mobile” and that they can interact with your brand in a unique way through your application.

4. Purchase “App Install” Advertising and Other Ads
App-install ads, especially if purchased from major advertising platforms such as Facebook, Google, etc., have become one of the best ways to bring potential new users to your application. These platforms allow advertisers to target users based on demographic information and can help ensure that your marketing budget is spent only on a good target.
For example, if you have a business that sells primarily to female university students, you can purchase Facebook ads that target women between 18 and 22 attending specific colleges and universities. If your application can be used by anyone in a particular country, it is possible to cover the whole population with app-install ads in order to reach the largest possible number of users. Note that this rule generally applies: the number of users that can be acquired through targeted ads is limited only by your budget.

5. Promo Codes and Other Freebies
Promotional campaigns that offer some kind of discount or free content are another great way to gain new installations. It is almost always worth thinking about a coupon code that offers something for free when the application is downloaded. It is also worth running a couple of tests to find out which free offers lead to most of the downloads, and then focusing on those. A side benefit of this method is that users are very likely to share a free coupon code with their friends. You may find that you gain something more through word of mouth among users if this behavior is stimulated by your offer.

6. Submit Your App to Review and Opinion Sites
This activity takes a bit of time, but if you can spend some of it submitting your app to the various app review sites and blogs that populate the Internet, this time will give you some satisfaction. The editors of these sites will be encouraged to download your application and write a review of how it works and what users can expect from it. Note that if you decide to market your application in this way you will have to test the aesthetics of the app thoroughly and make sure you have a high-quality design and a very good user interface. Critics can be very harsh if your application is not designed and built by the book. Always remember that the quality of the user interface and of the design does not depend at all on the fee you paid to your graphic designer or your marketing agency: what matters is the actual hands-on experience of the designer. In the world of apps the rules of the free market apply, and nothing counts but quality.

Here are links to some of the most popular review sites:

Macworld
TechCrunch
ZDNet
CNET Download.com
Mashable

7. Leveraging the Power of Social Media
The last tip of this short article is to make the most of the power of social media to gain new users and more downloads. Post on Facebook, communicate with your followers on Twitter, and if you have a brand it may be time to consider opening an Instagram or Pinterest account. Being active on social media is a great way to show that there is a personality or a team behind your application.

 

Conclusions

These 7 simple rules are fairly obvious and easy to grasp, but although everything seems so clear to those who work with i3Factory, more and more often we see money and energy wasted by those who still do not want to look at the real nature of this huge app market. I often talk with graphics and marketing agencies that have their customers spend their budget with such presumption that it becomes difficult, and painful, to make the customer realize that the “world has changed completely” with the era of the web, and that a further big change came with the era of the App Store.
So it often happens, in meetings, to witness several gaffes in which the computer seems to be driven by a “Ptolemaic” view, and a few pieces of good advice are not enough to keep many people from throwing away a lot of money.

 

One of the i3Factory developers is working on a project for sending routes with attached photos, text and audio/video, which needs to import a library for zipping files.
He found one library that seems to be constantly updated (usually you only find “non-ARC” ones targeting iOS 4):
 https://github.com/kolpanic/ZipKit .

ZipKit comes with a brief installation guide, but Claudio, our Italian developer, found it extremely superficial and was not able to import anything (that is, ZipKit.framework and libtouchzipkit.a remained “red”). The ZipKit repository also includes linked apps that should serve as examples, but after downloading the iOS one and trying to run it he got “Build Failed”. Our developer is still inexperienced in integrating projects into other projects and setting dependencies.

Let us then explain how the whole mechanism works, what needs to be touched, what does not, and so on.

First of all, given that the project is for iOS, it is necessary to exclude the framework target (frameworks exist only for OSX; they were introduced for iOS only with iOS 8 at the last WWDC, and to create a framework for iOS 7 you must follow a fairly intricate process explained here: http://www.raywenderlich.com/65964/create-a-framework-for-ios).
So the static library of interest is libtouchzipkit.a (the other one is the static library for OSX).
Our CTO, Carlo Vigiani, proceeded in this way and was able to compile a simple project (ZKTest, downloadable here) that uses the ZKFileArchive class (we did not try the result, but I assume that it works).

Hoping that this explanation will be helpful, since importing projects into other projects is not always obvious, here are the steps:
1) Open the project containing the application
2) Clone the whole ZipKit project from GitHub and copy the entire project folder inside your project (this step is not strictly necessary, but it is recommended so you are sure to keep the version of ZipKit that is right for your project).
ZipKit_Test-0
3) From the Finder, drag the ZipKit.xcodeproj file into your project (see figure: Carlo has created an app called ZKTest).

ZipKit_Test-1
4) At this point you have to add, in the Build Phases of your app target (ZKTest in the figure), the libraries that you will use: here I added libz.dylib (system) and libtouchzipkit.a (you will see the latter only if you have added the ZipKit project as a sub-project of the main project); see figure.

5) In order for the compiler to see the headers, you also need to add a header search path pointing to where ZipKit lives. If in step (2) you copied the ZipKit project folder inside the main project folder, you only need to add ./ZipKit. In our case, given that Carlo did not copy the project, he had to discover the absolute or relative path with respect to his project. Typically he proceeds like this:
5.1) In Xcode, select the sub-project (in this case ZipKit).

ZipKit_Test-3
5.2) In the Xcode assistant, recover the path of the sub-project relative to the main project (see figure): in this case it was ../../Frameworks/Data/ZipKit/ZipKit.xcodeproj

5.3) The main header file is ZipKit.h, which is located just below the directory we just found; so we go to our main target, select Build Settings, look for Header Search Paths and add the path previously found (see figure).

ZipKit_Test-4

5.4) We also look for the “Other Linker Flags” setting and add “-ObjC -all_load” (see figure).

At this point the project (ZKTest, download it here) is ready. Wherever we need it, we import the file “ZipKit.h” (note that we write ZipKit/ZipKit.h because the header file is located in a ZipKit subdirectory below the header search path we indicated before: ../../Frameworks/Data/ZipKit/ZipKit/ZipKit.h):

#import "ZipKit/ZipKit.h"
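Once the header is imported, zipping the attachments of a route boils down to a few calls to ZKFileArchive. The snippet below is only a sketch: the method names (archiveWithArchivePath:, deflateFiles:relativeToPath:usingResourceFork:) are how we recall the ZipKit interface, so check the class headers in the version you cloned before relying on it.

// Sketch only: zipping a route's attachments with ZKFileArchive.
// Verify the method names against the ZipKit headers of your checkout.
NSString *docs = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject];
NSString *zipPath = [docs stringByAppendingPathComponent:@"route.zip"];

// create (or open) the archive at the given path
ZKFileArchive *archive = [ZKFileArchive archiveWithArchivePath:zipPath];

// add the files of the route, keeping their paths relative to the Documents folder
NSArray *files = [NSArray arrayWithObjects:
                  [docs stringByAppendingPathComponent:@"photo1.jpg"],
                  [docs stringByAppendingPathComponent:@"notes.txt"],
                  nil];
NSInteger result = [archive deflateFiles:files relativeToPath:docs usingResourceFork:NO];
NSLog(@"ZipKit deflate result: %ld", (long)result);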
In general you have to keep these things in mind when you import a project:
  • headers are needed for the compiler to check the correctness of your code and to generate the binary code that calls the library functions; the header search path by default points to the project and to the system frameworks, so it is up to you to manually add any additional header search paths
  • even if you have included the correct headers, the linker still has to find the binary code associated with the functions you referenced: on iOS, since there are no third-party frameworks (at least until iOS 7), you must link the static libraries (which are then copied into the application and loaded statically along with the rest of the app code); you specify the static libraries in Build Phases -> Link Binary With Libraries
  • you can optionally add the touchzipkit target to the “Target Dependencies” of your app: this ensures that the compilation of your code is always preceded by the compilation of the library, which is useful if you make changes to it, because in that case the library target is rebuilt. You will probably not do this if you compile ZipKit once and then never touch it again, but it is suggested in case you decide to support several architectures or change SDK (recompiling assures you that there is no incompatibility). This of course cannot be done if you are given a library without source code (for example, Urban Airship or Google Analytics do not give you their source code, so you cannot add their projects to the target dependencies).

 


New MapKit features

In WWDC 2013 Apple introduced many new MapKit features for iOS 7 and added this framework in OSX 10.9 (Mavericks). One of the major changes, which in my opinion didn’t get enough relevance in the developer community, has been the introduction of some base classes that allow full map customization and support for offline maps. In this article I’m going to describe the new MKTileOverlay class and present an example, for both iOS and OSX, that demonstrates the new capabilities.

Since the earliest iPhone OS versions there have been many apps in the App Store supporting maps different from the ones provided by the operating system: consider for example navigation apps that required support for offline navigation, that is the possibility to see the map even without an internet connection. Another requirement came from some special kinds of applications that needed to show proprietary information (such as “Yellow Pages” apps) or technical information (e.g. when there was the requirement to show level curves for mountains or to represent the sea level).

There were several issues with this approach: first of all, the overall mapping experience differed completely from one implementation to another, and in most cases it was subpar compared with the OS maps performance (either with Google or Apple data). Besides, from the point of view of the developer there was the problem of providing the right mapping code to support the map provider's data: there was no unique solution, but many. Some were commercial and expensive, others were open source but lacking support, and finally there were a lot of browser-based solutions whose performance was far from that of the native maps, besides being difficult to integrate with Objective-C.

What we’re going to show in this article is how these things changed drastically and how it is easy to integrate your own map content inside the common MapKit framework.

Map overlays

At the base of our discussion there is the concept of “map overlay”. This is not new in MapKit, but with iOS 7 things changed. Overlays are essentially parts of a map that can be overlaid on the base map, that is the part of the map representing the ground, the borders, the roads, and so on. Typically overlays are used to emphasize regions of the map sharing a common property: e.g. to highlight a specific country, to represent the different intensities of an earthquake that occurred in a certain area, or to highlight a road path in a navigation app.

From the point of view of the developer, an overlay is any object that conforms to the MKOverlay protocol. This protocol defines the minimum properties and methods required to define an overlay: these are the approximate center coordinate and the bounding box that fully encloses the overlay (boundingMapRect). These two properties allow MapKit to determine whether a specific overlay is currently visible in the map, so that the framework can take the actions needed to display it. When an overlay object is added to the map using one of MKMapView's addOverlay: methods, control passes to the framework which, when it determines that a specific overlay needs to be displayed, calls the map view delegate asking it to provide the graphical representation of the overlay. Before iOS 7 Apple provided a set of concrete MKOverlay-compatible classes, each associated with its corresponding MKOverlayView. E.g. to represent a circular overlay we could use the built-in MKCircle class and then provide, for rendering, the associated MKCircleView class, without the need to define our own objects.

With iOS 7 things changed: now MKOverlayView has been replaced by MKOverlayRenderer. Even if this change doesn't require difficult refactoring to translate code from pre-iOS 7 to iOS 7, thanks to the fact that Apple did a 1:1 mapping of methods from the old class to the new one, conceptually the change is significant: the graphical representation of the overlay is no longer provided by a UIView subclass, which is typically considered a heavy class, but by MKOverlayRenderer, which is much more lightweight and descends directly from NSObject. The mapping between the old and new classes is complete, so in the circle example we see MKCircleView replaced by MKCircleRenderer.

Finally, overlays are stacked on the map, so they are given a Z-index that defines their position relative to each other and to the fixed parts of the map. Before iOS 7 you could stack these overlays and define their positions as in an array; with iOS 7 two separate stacks are defined, called “levels”: one is the “above roads” level, the other is the “above labels” level. This is an important and useful distinction because now we can change how the overlay rendering interacts with the map by specifying whether it lies above or below the labels.

Tile overlays

Whatever their complexity and size, we have seen overlays up to now as specific shapes. With the new MapKit provided with iOS 7 and OSX Mavericks, there is a new type of overlay called a tiled overlay. You may consider this type of overlay as a particular layer that covers the whole map: due to its large dimensions this overlay is tiled, that is partitioned into bitmap areas, to reduce the memory required to show the data and make the overlay rendering efficient. The purpose of this concrete implementation of the MKOverlay protocol, called MKTileOverlay (together with its rendering counterpart given by the MKTileOverlayRenderer class), is to efficiently represent the whole set of tiles across the map plane and for the different zoom levels. This last point is important: when you are displaying a map using bitmap drawing (as opposed to vector drawing) you can get an efficient implementation only if the specific bitmap representing an area of the map has the right amount of detail for the current zoom level. This means that if we show the full map of Europe we don't need to present roads, and cities should be represented as points and only for the major ones; as soon as we zoom in on a specific area we cannot continue to represent it by scaling the same tile, because it doesn't contain the required information and also because we would see evident scaling artifacts. The solution is to divide the continuous allowed zoom range into discrete levels and, for each level, provide the set of tiles with the details appropriate for that level. It is evident that if we keep the size of the single bitmap tile constant (e.g. 256 x 256 pixels), then for each zoom level we must increase the number of tiles by a factor of 4: you can see this in the picture below: the single European tile at zoom level 3, when zoomed to level 4, has been split, and further detailed, into four new tiles all having the same size as the original one.

 

(Figure: map tiles at successive zoom levels)

URL templates

The tiled overlay class works efficiently as it does a lazy loading of the tiles: this means that a bitmap tile is looked for and loaded only when it needs to be displayed. In order to know the location of the tile, the developer must define in the tile overlay definition the so-called URL template. This is a string representing a template for the final URL that will be used to retrieve the tile: this template will contain some placeholders that will be replaced by effective values to get the final URL. Each tile can be characterized by 4 parameters: x and y for the tile indexes in the map plane, z for the zoom level and finally scale for the bitmap image resolution (scale factor). The corresponding placeholders for these parameters are: {x} {y} {z} {scale}. So as an example, the OpenStreetMap template URL will be http://c.tile.openstreetmap.org/{z}/{x}/{y}.png and then the tile with index X=547 Y=380 and zoom level Z=10, that fully encloses the city of Rome, will be represented by the URL: http://c.tile.openstreetmap.org/10/547/380.png (see below the image taken from our OSX demo app).

(Figure: the OpenStreetMap tile enclosing Rome, from our OSX demo app)

Note that a URL template can be an http:// template to retrieve tiles from the internet, but it could also be a file:// template if we want to retrieve files from the disk: in this way we can save our tiles in the application bundle, or download and install a full tiles package for a certain city, and then display maps even if the device is not connected to the internet.

The mechanism used by the framework to translate a required tile coordinate (x;y;z;scale) into an effective bitmap is composed of several steps: this gives the developer the possibility to hook in their own code to customize the way the tiles are generated. This can be done by subclassing MKTileOverlay. Note that this is not required if setting the URL template is enough for you.

When the map framework needs a specific map tile, it calls the loadTileAtPath:result: method of the MKTileOverlay class (or subclass):

- (void)loadTileAtPath:(MKTileOverlayPath)path result:(void (^)(NSData *tileData, NSError *error))result;

The first method argument is called path and is an MKTileOverlayPath structure which contains the tile coordinates:

typedef struct {
	NSInteger x;
	NSInteger y;
	NSInteger z;
	CGFloat contentScaleFactor; // The screen scale that the tile will be shown on. Either 1.0 or 2.0.
} MKTileOverlayPath;

The second method argument is a completion block that needs to be called when the tile data has been retrieved: this completion block is called by passing the data and an error object. The MKTileOverlay default implementation calls the -URLForTilePath: method to retrieve the URL and then uses NSURLConnection to load the tile data asynchronously.

If we want to customize the tile loading behaviour, we can easily subclass MKTileOverlay and override loadTileAtPath:result: with our own implementation of the loading mechanism. E.g. we can implement our own tile caching mechanism (other than the one provided by the system via NSURLConnection) to return cached data before triggering the network call; or we could watermark the default tile if we are shipping a freemium version of our offline map.
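As a concrete illustration, here is a minimal sketch of such a caching subclass that keeps an in-memory NSCache of tiles in front of the default URL-template loading (the class name and the tileCache property are our own invention, not part of MapKit):

// CachedTileOverlay.h -- hypothetical subclass, sketch only
@interface CachedTileOverlay : MKTileOverlay
@property (nonatomic, strong) NSCache *tileCache;
@end

// CachedTileOverlay.m
@implementation CachedTileOverlay

- (NSCache *)tileCache {
    if (!_tileCache) {
        _tileCache = [[NSCache alloc] init];
    }
    return _tileCache;
}

- (void)loadTileAtPath:(MKTileOverlayPath)path result:(void (^)(NSData *tileData, NSError *error))result {
    // build a cache key from the tile coordinates
    NSString *key = [NSString stringWithFormat:@"%ld/%ld/%ld", (long)path.z, (long)path.x, (long)path.y];
    NSData *cachedData = [self.tileCache objectForKey:key];
    if (cachedData) {
        // return the cached tile without triggering a network call
        result(cachedData, nil);
        return;
    }
    // fall back to the default URL-template based loading, then cache the result
    [super loadTileAtPath:path result:^(NSData *tileData, NSError *error) {
        if (tileData) {
            [self.tileCache setObject:tileData forKey:key];
        }
        result(tileData, error);
    }];
}

@end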

A lighter way to hook into the tile loading mechanism is to override the -URLForTilePath: method in our subclass:

- (NSURL *)URLForTilePath:(MKTileOverlayPath)path;

The purpose of this method is to return the URL for a given tile path. The default implementation just fills out the URL template, as specified above. You need to redefine this method if the URL template mechanism is not sufficient for your needs. A typical case is when you want to pass in the URL a sort of “device identifier” to validate the eligibility of that specific app to access the URL (e.g. if you limit the quantity of data that a user can access in a given time, or if you want to charge for this data); another case is when you have multiple tile servers and you want to do a sort of in-app load balancing or region-based API access (e.g. you have servers in multiple locations and, based on the effective device location, you want to access the closest server).
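A minimal sketch of the load-balancing case might look like the following (the mirror host names are invented for the example):

// Sketch only: spread tile requests across a few mirror hosts (host names are hypothetical)
- (NSURL *)URLForTilePath:(MKTileOverlayPath)path {
    NSArray *hosts = [NSArray arrayWithObjects:@"a.tiles.example.com", @"b.tiles.example.com", @"c.tiles.example.com", nil];
    NSInteger index = (path.x + path.y) % (NSInteger)[hosts count];
    NSString *urlString = [NSString stringWithFormat:@"http://%@/%ld/%ld/%ld.png",
                           [hosts objectAtIndex:(NSUInteger)index], (long)path.z, (long)path.x, (long)path.y];
    return [NSURL URLWithString:urlString];
}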

The tile renderer

As all overlays are associated with a renderer, the tile overlay too has its concrete renderer class: MKTileOverlayRenderer. Normally you don't need to subclass this renderer, so your map delegate's -mapView:rendererForOverlay: method can simply instantiate the default tile overlay renderer initialized with your default or subclassed tile overlay instance. Possible applications of a custom overlay renderer are when you need to further manipulate the bitmap image, e.g. adding a watermark or applying a filter, and this manipulation is independent of the tile source. In the demo code I defined a custom renderer to be used specifically for the Google map, whose effect is to add a sort of colored translucent mosaic on top of the map tiles.
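For reference, when no customization is needed the delegate method reduces to something along these lines (a minimal sketch):

- (MKOverlayRenderer *)mapView:(MKMapView *)mapView rendererForOverlay:(id<MKOverlay>)overlay {
    if ([overlay isKindOfClass:[MKTileOverlay class]]) {
        // the default renderer is enough when no extra bitmap manipulation is needed
        return [[MKTileOverlayRenderer alloc] initWithTileOverlay:(MKTileOverlay *)overlay];
    }
    // other overlay types (circles, polygons, ...) would be handled here
    return nil;
}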

The demo code

You can get the demo code from GitHub. This code works on both iOS 7 and OSX 10.9 and its purpose is to present a map and give the user the possibility to switch between different tile sets: Apple (system), Google, OpenStreetMap and an offline subset of OpenStreetMap tiles bundled within the app. In all cases I applied an extra overlay layer to show the tile grid with the x,y,z path associated with each tile. (Note: on OSX, if you don't code sign the app with your OSX Developer Program certificate you will not be able to see the Apple tiles; the other three tile sets will be visible anyway.) You will see how you can fully take advantage of all the MapKit features (zoom, rotation, pan, custom overlays and also annotations, which I didn't include in the demo); the only difference is in the tile source and how the tiles are rendered.

 

(Figure: the offline tiles demo app)

As you can see in the demo apps, there is a main view controller (iOS) and window controller (OSX). In both cases the main view contains an instance of MKMapView and a segmented control to switch between different visualizations. On the map I have instantiated two overlays. The first one is the grid overlay:

 // load grid tile overlay
 self.gridOverlay = [[GridTileOverlay alloc] init];
 self.gridOverlay.canReplaceMapContent=NO;
 [self.mapView addOverlay:self.gridOverlay level:MKOverlayLevelAboveLabels];

This is a tile overlay of the GridTileOverlay subclass. It will not replace the map content (this means that it is effectively overlaid on the map content) and its purpose is to draw the tile grid just above the labels.

The reloadOverlay method is called each time the overlay type selector is changed or when the view is loaded. It removes any existing tileOverlay and replaces it with a new one:

-(void)reloadTileOverlay {

	// remove existing map tile overlay
	if(self.tileOverlay) {
		[self.mapView removeOverlay:self.tileOverlay];
	}

	// define overlay
	if(self.overlayType==CustomMapTileOverlayTypeApple) {
		// do nothing
		self.tileOverlay = nil;
	} else if(self.overlayType==CustomMapTileOverlayTypeOpenStreet || self.overlayType==CustomMapTileOverlayTypeGoogle) {
		// use online overlay
		NSString *urlTemplate = nil;
		if(self.overlayType==CustomMapTileOverlayTypeOpenStreet) {
			urlTemplate = @"http://c.tile.openstreetmap.org/{z}/{x}/{y}.png";
		} else {
			urlTemplate = @"http://mt0.google.com/vt/x={x}&y={y}&z={z}";
		}
		self.tileOverlay = [[MKTileOverlay alloc] initWithURLTemplate:urlTemplate];
		self.tileOverlay.canReplaceMapContent=YES;
		[self.mapView insertOverlay:self.tileOverlay belowOverlay:self.gridOverlay];
	}
	else if(self.overlayType==CustomMapTileOverlayTypeOffline) {
		NSString *baseURL = [[[NSBundle mainBundle] bundleURL] absoluteString];
		NSString *urlTemplate = [baseURL stringByAppendingString:@"/tiles/{z}/{x}/{y}.png"];
		self.tileOverlay = [[MKTileOverlay alloc] initWithURLTemplate:urlTemplate];
		self.tileOverlay.canReplaceMapContent=YES;
		[self.mapView insertOverlay:self.tileOverlay belowOverlay:self.gridOverlay];
	}
}

In the Apple maps case no extra overlay is added, of course: we just use the base map. When we select the Google or OpenStreetMap view we use a standard MKTileOverlay class with the appropriate URL template. In both cases the overlay is added with the canReplaceMapContent property set to YES: this replaces the Apple base maps completely and avoids loading that data. Note that we add the tileOverlay just below the gridOverlay. Finally, the offline case still uses the base overlay class but with a file URL template: note that we build the path from a hierarchical directory structure stored inside the bundle. In this case too the new tiles replace the base ones and are inserted below the grid.

Our controller, being a delegate of MKMapView, responds to -mapView:rendererForOverlay:. This is required by every application that uses overlays, as this is the point where the app effectively tells the system how to draw an overlay that is currently visible on the map. In our case we just check that the overlay is a tile overlay (a general precaution, since we might have other types of overlays) and, based on the selection, we use either the standard MKTileOverlayRenderer or a custom WatermarkTileOverlayRenderer. The latter applies a randomly colored semi-transparent effect on top of the tiles, resulting in a vitreous mosaic effect.

Conclusions

The possibility to easily switch between different map types while keeping the same “map navigation experience” is one of the most revolutionary features introduced with iOS 7, together with the long awaited introduction of native maps in OSX. It provides the same map infrastructure whatever the content. Obviously the generation of custom map content is another huge and highly specialized task that we cannot cover here, but for developers this is a great step forward.

References

  • Location and Maps Programming Guide from Apple Developer Library
  • WWDC 2013 session 304 video: What’s new in Map Kit from Apple WWDC 2013 videos
  • MBXMapKit GitHub project by Mapbox – A simple library to integrate Mapbox maps on top of MapKit, one of the first applications of tiled overlays


  • The GDAL project one of the main references for custom maps creation. Here is a link to a compiled version of the GDAL OSX Framework
  • Maperitive another great tool (for Windows only) to create custom maps and prepare them for offline usage
Posted by Carlo Vigiani

We often have to update our applications with the high-resolution images needed for the new iPad (iPad 3 or iPad 4). Fortunately, the new iPad mini has maintained the same resolution as the first iPad, i.e. 1024 × 768 pixels.
Since it is not always easy to find the official Apple documents, in this article I have gathered all the information we need to update the icons, the launch (intro) images, and so on.

First of all let’s start with this handy table:

Device/Screen File Name (PNG) Icon Size (pixels)
iPhone and iPod
Application Icon for iPhone (retina display) Icon@2x.png 114 x 114
Application Icon for iPhone Icon.png 57 x 57
Settings/Spotlight icon for iPhone (retina display) Icon-Small@2x.png 58 x 58
Settings/Spotlight icon for iPhone Icon-Small.png 29 x 29
Launch image Portrait (retina display) Default@2x.png 640 x 960
Launch image Portrait Default.png 320 x 480
iPhone 5
Launch image for iPhone 5 Portrait (retina display) Default-568h@2x.png 640 x 1136
iPad
Application Icon for the new iPad (retina display) Icon-72@2x.png 144 x 144
Application Icon for the iPad Icon-72.png 72 x 72
Settings/Spotlight icon for iPad (retina display) Icon-Small-50@2x.png 100 x 100
Settings/Spotlight icon for iPad Icon-Small-50.png 50 x 50
Launch image Portrait (retina display) Default-Portrait@2x.png 1536 x 2008
Launch image Portrait Default-Portrait.png 768 x 1004
Launch image Landscape (retina display) Default-Landscape@2x.png 2048 x 1496
Launch image Landscape Default-Landscape.png 1024 x 748
iTunes App Store
App icon for the App Store (retina display) iTunesArtwork@2x.png 1024 x 1024
App icon for the App Store iTunesArtwork.png 512 x 512

Remember that the transition from iOS 5 to iOS 6 is also when the new iPhone 5 arrived, along with the 5th generation iPod touch.
These new Apple devices brought only one big change in terms of App development: the screen resolution. They have a large 4″ WDVGA (Wide Double VGA) screen, 640 × 1136 pixels, 326 DPI Retina display. These devices have the same width as the iPhone 4/4S but are 176 pixels taller in portrait mode.

App Icon Template

Let me point out again, as I have done in another article, this useful tool that can be downloaded from the site appicontemplate.com.

By downloading the file you will get a PSD of the app icon that, through Photoshop Smart Objects, allows you to automate the process of exporting the various sizes of the icon .png files that must be included in the bundle of every iOS App.

With this Photoshop template we only need to edit the largest icon, and it will automatically render the smaller sizes through a fast workflow.
The template has been created and is maintained by the Danish designer Michael Flarup.

How to use the App Icon Template?
The template works with Photoshop CS2 or later. Just open the PSD file with your version of Photoshop, right-click on the layer called “SMART EDIT THIS OBJECT” and click on “Edit Contents”.
This will open the Icon.psb file and you can create your artwork on this canvas. After saving, the Icon.psb should be automatically rendered at the various sizes in the main PSD file. It is also possible to use the Photoshop Actions that are bundled with the resource to export both squared and rounded-corner versions of the icon files.

Good Design!

Why video composition

You may think that video composition should be limited to applications like iMovie or Vimeo so you can consider this subject, at least from the point of view of the developer, to be limited to a niche of video experts. Instead it can be extended to a broader range of applications, not essentially limited to practical video editing. In this blog I will provide an overview of the AV Foundation framework applied on a practical example.

In my particular case the challenge was to build an application that, starting from a set of existing video clips, was able to build a story made by attaching a subset of these clips based on decisions taken by the user during the interaction with the app. The final play is a set of scenes, shot in different locations, that compose a story. Each scene consists of a prologue, a conclusion (epilogue) and a set of smaller clips that will be played by the app based on some user choices. If the choices are correct, the user will be able to play the whole scene up to its happy ending, but in case of mistakes the user will return to the initial prologue scene or to some intermediate scene. The diagram below shows a possible scheme of a typical scene: one prologue, a winning stream (green), a few branches (yellow are intermediate, red are losing branches) and a happy ending. So somewhere in TRACK1 the user will be challenged to take a decision; if he/she is right the game will continue with TRACK2, if not it will enter the yellow TRACK4, and so on.

iPhone & iPad: Movie Game Storyboard
What I have in my hands is the full set of tracks, each track representing a specific subsection of a scene, and a storyboard which gives me the rules to be followed in order to build the final story. So the storyboard is made of the scenes, of the tracks that compose each scene and of the rules that establish the flow through these tracks. The main challenge for the developer is to put together these clips and play a specific video based on the current state of the storyboard, then advance to the next, select a new clip again and so on: everything should be smooth and interruptions limited. Besides, the user needs to take his decisions by interacting with the app, and this can be done by overlaying the movie with some custom controls.

The AV Foundation Framework

Trying to reach the objectives explained in the previous paragraph using the standard Media Player framework view controllers, MPMoviePlayerController and MPMoviePlayerViewController, would be impossible. These controllers are good for playing a movie and providing the system controls, with full-screen and device rotation support, but absolutely not for advanced control. Since the release of the iPhone 3GS the camera utility has had some trimming and export capabilities, but these capabilities were not given to developers through public functions of the SDK. With the introduction of iOS 4, the work done by Apple for the iMovie app gave developers a rich set of classes that allow full video manipulation. All these classes have been collected and exported in a single public framework, called AV Foundation. This framework has existed since iOS 2.2; at that time it was dedicated to audio management with the well known AVAudioPlayer class, then it was extended in iOS 3 with the AVAudioRecorder and AVAudioSession classes, but the full set of features that enable advanced video capabilities arrived only with iOS 4 and was fully presented at WWDC 2010.

The position of AV Foundation in the iOS framework stack is just below UIKit, behind the application layer, and immediately above the basic Core Services frameworks, in particular Core Media, which is used by AV Foundation to import the basic timing structures and functions needed for media management. In any case you can note the different position in the stack in comparison with the very high-level Media Player framework. This means that this kind of framework cannot offer a plug-and-play class for simple video playing, but you will appreciate the high-level and modern concepts behind it; for sure we are not at the same level as older frameworks such as Core Audio.

(image source: from Apple iOS Developer Library)

Building blocks

The class organization of AV Foundation is quite intuitive. The starting point and main building block is AVAsset. AVAsset represents a static media object and it is essentially an aggregate of tracks, which are timed representations of parts of the media. Each track is of a uniform type, so we can have audio tracks, video tracks, subtitle tracks, and a complex asset can be made of multiple tracks of the same type, e.g. multiple audio tracks. In most cases an asset is made of one audio and one video track. Note that AVAsset is an abstract class, so it is unrelated to the physical representation of the media it represents; besides, creating an AVAsset instance doesn't mean that we have the whole media ready to be played: it is a purely abstract object.


There are two concrete asset classes available: AVURLAsset, to represent a media in a local file or on the network, and AVComposition (together with its mutable variant AVMutableComposition) for an asset composed of multiple media. To create an asset from a file we need to provide its file URL:

NSDictionary *optionsDictionary = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES] forKey:AVURLAssetPreferPreciseDurationAndTimingKey];
AVURLAsset *myAsset = [AVURLAsset URLAssetWithURL:assetURL options:optionsDictionary];

The options dictionary can be nil, but for our purposes, that is making a movie composition, we need to calculate the duration exactly and provide random access to the media. This extra option, i.e. setting the AVURLAssetPreferPreciseDurationAndTimingKey key to YES, could require extra time during asset initialization, depending on the movie format. If the movie is in QuickTime or MPEG-4 format, the file contains additional summary information that avoids this extra parsing time; but there are other formats, like MP3, where this information can be extracted only after decoding the media file, and in such cases the initialization time is not negligible. This is a first recommendation we give to developers: please use the right file format for the application.
In our application we already know the characteristics of the movies we are using, but in a different kind of application, where you must do some editing on user-imported movies, you may be interested in inspecting the asset properties. In such a case we must remember the basic rule that initializing an asset doesn't mean we have loaded and decoded the whole asset in memory: every property of the media file can be inspected, but this could require some extra time. For completeness we simply introduce the way asset inspection can be done, referring the interested reader to the reference documentation (see the suggested readings list at the end of this post). Basically each asset property can be inspected using an asynchronous protocol called AVAsynchronousKeyValueLoading, which defines two methods:

- (AVKeyValueStatus)statusOfValueForKey:(NSString *)key error:(NSError **)outError
- (void)loadValuesAsynchronouslyForKeys:(NSArray *)keys completionHandler:(void (^)(void))handler

The first method is synchronous and immediately returns the knowledge status of the specified value. E.g. you can ask for the status of “duration” and the method will return one of these possible statuses: loaded, loading, failed, unknown, cancelled. In the first case the key value is known and the value can be immediately retrieved. If the value is unknown, it is appropriate to call the loadValuesAsynchronouslyForKeys:completionHandler: method which, at the end of the operation, will call the callback given in the completionHandler block, which in turn will query the status again to take the appropriate action.
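For illustration, a minimal sketch of asynchronously loading the “duration” key of the asset created above could look like this:

// Sketch only: asynchronously load the "duration" key of myAsset before using it
[myAsset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:@"duration"] completionHandler:^{
    NSError *error = nil;
    AVKeyValueStatus status = [myAsset statusOfValueForKey:@"duration" error:&error];
    if (status == AVKeyValueStatusLoaded) {
        // the value is now known and can be read without blocking
        NSLog(@"Asset duration: %f s", CMTimeGetSeconds([myAsset duration]));
    } else {
        // failed, cancelled or still unknown: handle accordingly
        NSLog(@"Could not load duration (status %d): %@", (int)status, error);
    }
}];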

Video composition

As I said at the beginning, my storyboard is made of a set of scenes and each scene is composed of several clips whose playing order is not known a priori. Each scene behaves separately from the others, so we'll create a composition for each scene. When we take a set of assets, or tracks, and build a composition from them, we are all in all creating another asset. This is the reason why the AVComposition and AVMutableComposition classes are in fact subclasses of the base AVAsset class.
You can add media content inside a mutable composition by simply selecting a segment of an asset, and adding it to a specific range of the new composition:

- (BOOL)insertTimeRange:(CMTimeRange)timeRange ofAsset:(AVAsset *)asset atTime:(CMTime)startTime error:(NSError **)outError

In our example we have a set of tracks and we want to add them one after the other in order to generate a continuous sequence of clips. So the code can be written simply in this way:

 

AVMutableComposition *composition = [AVMutableComposition composition];
CMTime current = kCMTimeZero;
NSError *compositionError = nil;
for(AVAsset *asset in listOfMovies) {
    BOOL result = [composition insertTimeRange:CMTimeRangeMake(kCMTimeZero, [asset duration])
                                       ofAsset:asset
                                        atTime:current
                                         error:&compositionError];
    if(!result) {
        if(compositionError) {
            // manage the composition error case
        }
    } else {
        current = CMTimeAdd(current, [asset duration]);
    }
}

Here we introduced the concept of time. Note that media have a notion of time different from the usual one: time can move back and forth, and the time rate can be higher or lower than 1x if you are playing the movie in slow motion or fast forward. Besides, it is considered more convenient to represent time not as a floating point or integer number but as a rational number. For this reason the Core Media framework provides the CMTime structure and a set of functions and macros that simplify the manipulation of these structures. So in order to build a specific time instance we do:

CMTime myTime = CMTimeMake(value,timescale);

which in fact specifies a number of seconds given by value/timescale. The main reason for this choice is that movies are made of frames, and frames are paced at a fixed rate per second. So for example if we have a clip which has been shot at 25 fps, it is convenient to represent the single frame interval as a CMTime with value=1 and timescale=25, corresponding to 1/25th of a second; 1 second will be given by a CMTime with value=25 and timescale=25, and so on (of course you can still work with plain seconds if you like, simply using the CMTimeMakeWithSeconds(seconds, preferredTimescale) function). So in the code above we initially set the current time to 0 seconds (kCMTimeZero), then start iterating over all of our movies, which are the assets in listOfMovies. We add each of these assets at the current position of our composition using its full range ([asset duration]). For every asset we advance our composition head (current) by the length (in CMTime) of the asset. At this point our composition is made of the full set of tracks added in sequence. We can now play them.

Playing an asset

The AVFoundation framework doesn't offer any built-in full player as we are used to seeing with MPMoviePlayerViewController. The engine that manages the playing state of an asset is provided by the AVPlayer class. This class takes care of all aspects related to playing an asset and essentially it is the only class in AV Foundation that interacts with the application view controllers to keep the application logic in sync with the playing status: this is relevant for the kind of application we are considering in this example, as the playback state may change during the movie execution based on specific user interactions at specific moments in the movie. However we don't have a direct relation between AVAsset and AVPlayer, as their connection is mediated by another class called AVPlayerItem. The purpose of this class organization is to separate the asset, considered as a static entity, from the player, which is purely dynamic, by providing an intermediate object that represents a specific presentation state for an asset. This means that to a given and unique asset we can associate multiple player items, all representing different states of the same asset and played by different players. So the flow is: from a given asset, create a player item and then assign it to the final player.

AVPlayerItem *compositionPlayerItem = [AVPlayerItem playerItemWithAsset:composition];
AVPlayer *compositionPlayer = [AVPlayer playerWithPlayerItem:compositionPlayerItem];

 

In order to render it on screen we have to provide a view capable of rendering the current playing status. We already said that iOS doesn't offer an off-the-shelf view for this purpose, but what it offers is a special Core Animation layer called AVPlayerLayer. You can insert this layer in your player view's layer hierarchy or, as in the example below, use it as the base layer for that view. So the suggested approach in this case is to create a custom MovieViewer and set AVPlayerLayer as its base layer class:

// MovieViewer.h

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
@interface MovieViewer : UIView {
}
@property (nonatomic, retain) AVPlayer *player;
@end

// MovieViewer.m

@implementation MovieViewer
+ (Class)layerClass {
return [AVPlayerLayer class];
}
- (AVPlayer*)player {
return [(AVPlayerLayer *)[self layer] player];
}
- (void)setPlayer:(AVPlayer *)player {
[(AVPlayerLayer *)[self layer] setPlayer:player];
}
@end

// Instantiating MovieViewer in the scene view controller
// We suppose "viewer" has been loaded from a nib file
// MovieViewer *viewer
[viewer setPlayer:compositionPlayer];

At this point we can play the movie, which is quite simple:

[[viewer player] play];
Observing playback status

It is relevant for our application to monitor the status of the playback and to observe some particular timed events occurring during the playback.
As far as status monitoring is concerned, you will follow the standard KVO-based approach by observing changes in the status property of the player:

// inside the SceneViewController.m class we’ll register to player status changes
[viewer.player addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:NULL];

// and then we implement the observation callback
-(void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if(object==viewer.player) {
        AVPlayer *player = (AVPlayer *)object;
        if(player.status==AVPlayerStatusFailed) {
      // manage failure
        } else if(player.status==AVPlayerStatusReadyToPlay) {
      // player ready: manage success state (e.g. by playing the movie)
        } else if(player.status==AVPlayerStatusUnknown) {
      // the player is still not ready: manage this waiting status
        }
    }
}

Differently from the KVO-observable properties, timed-event observation is not based on KVO: the reason is that the playhead moves continuously and playback is usually done on a dedicated thread, so the system prefers to send its notifications through a dedicated channel, which in this case consists of a block-based callback that we can register to track such events. We have two ways to observe timed events:

  • registering for periodic intervals notifications
  • registering when particular times are traversed

In both methods the user can specify a serial queue the callbacks will be dispatched to (it defaults to the main queue) and of course the callback block. It is relevant to note the serial behaviour of the queue: all events will be queued and executed one by one; for frequent events you must ensure that these blocks execute fast enough to allow the queue to process the next ones, and this is especially true if you are executing the block on the main thread, to avoid making the application unresponsive. Don't forget to schedule the block on the main thread if you update the UI.
Registration to periodic intervals is done in this way, where we ask for a 1 second callback whose main purpose will be to refresh the UI (typically updating a progress bar and the current playback time):

// somewhere inside SceneController.m
id periodicObserver = [viewer.player addPeriodicTimeObserverForInterval:CMTimeMakeWithSeconds(1.0, 1) queue:NULL usingBlock:^(CMTime time){
[viewer updateUI];
}];
[periodicObserver retain];

// and in the clean up method
-(void)cleanUp {
[viewer.player removeTimeObserver:periodicObserver];
[periodicObserver release];
}

// inside MovieViewer.m
-(void)updateUI {
// do other stuff here
// …
// we calculate the playback progress ratio by dividing current position of playhead into the total movie duration
float progress = CMTimeGetSeconds(player.currentTime)/CMTimeGetSeconds(player.currentItem.duration);
// then we update the movie viewer progress bar
[progressBar setProgress:progress];
}

 

Registration to timed events is done using a similar method which takes as argument a list of NSValue representations of CMTime (AVFoundation provides a NSValue category that adds CMTime support to NSValue):

// somewhere inside SceneController.m
id boundaryObserver = [viewer.player addBoundaryTimeObserverForTimes:timedEvents queue:NULL usingBlock:^{
[viewer processTimedEvent];
}];
[boundaryObserver retain];
// inside MovieViewer.m
-(void)processTimedEvent {
// do something in the UI
}
In both cases we need to unregister and deallocate somewhere in our scene controller the two observer opaque objects; we may suppose the existence of a cleanup method that will be assigned this task:
-(void)cleanUp {
[viewer.player removeTimeObserver:periodicObserver];
[periodicObserver release];
[viewer.player removeTimeObserver:boundaryObserver];
[boundaryObserver release];
}

While this code shows the general way to handle an event, in our application it is more appropriate to assign a specific action to each event, that is, we need to customize each handling block. Looking at the picture below, you can see that at specific times inside each of our clips we assigned a specific event.


The figure is quite complex and not all relationships have been highlighted. Essentially what you can see is the “winning” sequence made of all the green blocks: they have been placed consecutively in order to avoid the playhead jumping to different segments when the user takes the right decisions, so playback will continue without interruption and will be smooth. With the exception of the prologue track, which is just the prologue of the story and requires no user interaction, and its corresponding conclusion, simply an epilogue where the user is invited to go to the next scene, all other tracks have been marked with a few timed events, identified by the dashed red vertical lines. Essentially we have identified 4 kinds of events:

  • segment (clip) starting point: this will be used as a destination point for the playhead in case of jump;
  • show controls: all user controls will be displayed on screen, user interaction is expected;
  • hide controls: all user controls are hidden, and no more user interaction is allowed;
  • decision point, usually coincident with the hide controls event: the controller must decide which movie segment must be played based on the user decision.

Note that this approach is quite flexible and in theory you can define any kind of event; this depends on the imagination of the game designers. From the point of view of the code, we in fact subclassed AVURLAsset by adding an array of timed event definitions. At composition creation time, these events are re-timed according to the new time base (e.g. if an event occurs at 0:35 of a clip, but the starting point of the clip is at 1:45 of the entire sequence, then the event must be re-timed to 1:45 + 0:35 = 2:20). At this point, with the full list of events, we can rewrite our boundary registration:

// events is the array of all re-timed events in the complete composition
__block __typeof__(self) _self = self; // avoids retain cycle on self when used inside the block
[events enumerateObjectsUsingBlock:^(id obj, NSUInteger idx, BOOL *stop) {
TimedEvent *ev = (TimedEvent *)obj;
[viewer.player addBoundaryTimeObserverForTimes:[NSArray arrayWithObject:[NSValue valueWithCMTime:ev.time]]
queue:dispatch_get_main_queue()
usingBlock:^{
// send event to interactiveView
[viewer performTimedEvent:ev];
[_self performTimedEvent:ev];
}];
}];

 

 

As you can see the code is quite simple: for each timed event we register a single boundary which simply calls two methods, one for the movie viewer and one for the scene controller; in both cases we send the specific event so the receiver will know exactly what to do. The viewer will normally take care of UI interaction (it will overlay a few controls on top of the player layer, so according to the events these controls will be shown or hidden; besides the viewer knows which control has been selected by the user) while the scene controller will manage the game logic, especially in the case of the decision events. When the controller finds a decision event, it must move the playhead to the right position in the composition:

 

CMTime goToTime = ...; // determined from the starting time of the next segment
[viewer hide];
[viewer.player seekToTime:goToTime toleranceBefore:kCMTimeZero toleranceAfter:kCMTimePositiveInfinity completionHandler:^(BOOL finished) {
    if(finished) {
        dispatch_async(dispatch_get_main_queue(), ^{
            [viewer show];
        });
    }
}];

 

What happens in the code above is that when we need to move the playhead to a specific time, we first determine this time and then ask the AVPlayer instance to seek to it, allowing the head to land at that position or after it with some tolerance (kCMTimePositiveInfinity) but not before it (kCMTimeZero in the toleranceBefore: parameter; we need this because the composition is made of consecutive clips, and moving the playhead before the starting time of our clip could show a small portion of the previous clip). Note that this operation is not immediate and, even if quite fast, it can take about one second. What happens during this transition is that the player layer shows a still frame somewhere in the destination time region, then starts decoding the full clip and resumes playback from another frame, usually different from the still one. The final effect is not really good, so after some experimentation I decided to hide the player layer immediately before starting the seek and to show it again as soon as the player informs me (through the completionHandler callback block) that the movie is ready to be played again.

Conclusions and references

I hope this long post will push other developers to start working on interactive movie apps that leverage the advanced video editing capabilities of iOS for purposes other than plain video editing. The AVFoundation framework offers us very powerful tools which are not difficult to use. In this post I didn't explore some more advanced classes, such as AVVideoComposition and AVSynchronizedLayer: the former is used to create transitions, the latter is used to synchronize Core Animation effects with the internal media timing.

Great references on the subject can be found in the iOS Developer Library or WWDC videos and sample code:

  • For a general overview: AVFoundation Programming Guide in the iOS Developer Library
  • For the framework classes documentation: AVFoundation Framework Reference in the iOS Developer Library
  • Video: Session 405 – Discovering AV Foundation from WWDC 2010, available in iTunesU to registered developers
  • Video: Session 407 – Editing Media with AV Foundation from WWDC 2010, available in iTunesU to registered developers
  • Video: Session 405 – Exploring AV Foundation from WWDC 2011, available in iTunesU to registered developers
  • Video: Session 415 – Working with Media in AV Foundation from WWDC 2011, available in iTunesU to registered developers
  • Sample code: AVPlayDemo from WWDC 2010 sample code repository
  • Sample code: AVEditDemo from WWDC 2010 sample code repository

 

Written by Carlo Vigiani

The iPhone 5 has a larger screen than its predecessors: iOS 6 developers must support a resolution of 640 x 1136 px in addition to the 640 x 960 px of the iPhone 4.
But even in this case, if you follow Apple's logic, the work to be done is not at all complicated.

The blog http://blog.mugunthkumar.com/coding/supporting-the-iphone-5/ proposes the following steps:

Step 1:

The iPhone 5 requires a new instruction set, armv7s. Only the latest version of Xcode (4.5) supports generating the armv7s instruction set. Do note that Xcode 4.5 no longer supports armv6 and drops the iPhone 3G and older devices. So we must now develop our application using Xcode 4.5.

Step 2:

The next step is to add a new launch image (Default-568h@2x.png). When you build the project with Xcode 4.5, you receive the warning "Missing Retina 4 launch image". Click "Add" to add a default image to the project.

 

Step 3:

However, most of the nib files will still not be resized correctly. The next step is to check the autoresizing masks of all the nib files and make sure that the views in each nib resize automatically according to the new height.

 

The properties to use are:

UIViewAutoresizingFlexibleTopMargin,
UIViewAutoresizingFlexibleBottomMargin,
UIViewAutoresizingFlexibleHeight.

Use UIViewAutoresizingFlexibleHeight for the top-level view, so that it resizes with the main window. Use UIViewAutoresizingFlexibleTopMargin and/or UIViewAutoresizingFlexibleBottomMargin for the subviews.

UIViewAutoresizingFlexibleTopMargin is used if you want the subview to stay "pinned" to the bottom (the top margin is flexible), while UIViewAutoresizingFlexibleBottomMargin is used if you want the subview to stay "pinned" to the top (the bottom margin is flexible).

If you are using Cocoa Auto Layout, this step becomes optional. However, Auto Layout is not supported on iOS 5.
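As a minimal sketch of the previous step, the masks can also be set in code; contentView, headerView and bottomToolbar are illustrative names, not views from the original post.

// The top-level content view stretches with the window height.
self.contentView.autoresizingMask = UIViewAutoresizingFlexibleWidth |
                                     UIViewAutoresizingFlexibleHeight;

// A header pinned to the top: the bottom margin is the flexible one.
self.headerView.autoresizingMask = UIViewAutoresizingFlexibleWidth |
                                    UIViewAutoresizingFlexibleBottomMargin;

// A toolbar pinned to the bottom: the top margin is the flexible one.
self.bottomToolbar.autoresizingMask = UIViewAutoresizingFlexibleWidth |
                                       UIViewAutoresizingFlexibleTopMargin;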

 

Step 4:

Finally, any layer that you have added to a view must be resized manually. The following code shows how to do this: we use patternLayer to add a background pattern to all view controllers, and it needs to be resized in viewWillLayoutSubviews.

 

-(void)viewWillLayoutSubviews {
    self.patternLayer.frame = self.view.bounds;
    [super viewWillLayoutSubviews];
}

Step 5 (if you were a messy coder):

If the height of a view was hard-coded at 460 or 480, you may need to change it to use the screen bounds instead. For example,

 

self.window = [[UIWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];

instead of

self.window = [[UIWindow alloc] initWithFrame:CGRectMake(0, 0, 320, 480)];
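If hard-coded heights are scattered around the code base, a small helper function can centralize the check for the taller screen. The sketch below is our own illustration, not from the original post.

// Minimal sketch: returns YES when running on a 4" (568-point tall) screen,
// simulator included; assumes UIKit is imported.
static inline BOOL isFourInchScreen(void) {
    return [UIScreen mainScreen].bounds.size.height == 568.0f;
}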

 

Create images with the new dimensions

As I could see on the blog http://redth.info/get-your-monotouch-apps-ready-for-iphone-5-ios-6-today/, unfortunately the -568h@2x.png naming convention only seems to be used for the default launch image and does not apply to the other images of the application. This means that if you are using a custom background image (e.g. a UITableView background), you may need to create a new background image at the correct resolution and let the application determine when to use each image.
It would be nice if Apple had extended support for the new screen in the new SDK through the method:
[UIImage imageNamed:@"my-image"]

Currently I can pass "my-image" (the name of my image, without extension) and the operating system looks for the image in the application bundle according to this criterion: if the screen is Retina, it looks for an image with the @2x suffix in the name; if not found, it looks for the image without the suffix. We would have expected Apple to extend the algorithm to also search for the -568h@2x suffix in the case of 4" screens. Unfortunately it does not, and that is why we handle it explicitly in our code.

For example, in our non-4-inch compatible app, I have two images:

Images/TableViewBackground.png – 320×358
Images/TableViewBackground@2x.png – 640×716

With the new resolution, I need to create a third image (we decided to use the -568h@2x.png naming convention, even if it is not processed by Apple):

Images/TableViewBackground-568h@2x.png

An elegant approach is to create a category on the UIImage class (with a little imagination, we call it UIImage+Retina4) and, at runtime, swap the implementation of "imageNamed:" with one that can handle the new convention:


// inside UIImage+Retina4.h
#import <UIKit/UIKit.h>

@interface UIImage (Retina4)

@end

// inside UIImage+Retina4.m
#import "UIImage+Retina4.h"
#ifdef TARGET_MAC_OS
#import <objc/objc-class.h>
#else
#import <objc/runtime.h>
#endif

static Method origImageNamedMethod = nil;

@implementation UIImage (Retina4)

+ (void)initialize {
    // exchange the implementations of imageNamed: and retina4ImageNamed:
    origImageNamedMethod = class_getClassMethod(self, @selector(imageNamed:));
    method_exchangeImplementations(origImageNamedMethod,
                                   class_getClassMethod(self, @selector(retina4ImageNamed:)));
}

+ (UIImage *)retina4ImageNamed:(NSString *)imageName {
    NSMutableString *imageNameMutable = [imageName mutableCopy];

    // if the name already contains the @2x suffix, insert -568h just before it
    NSRange retinaAtSymbol = [imageName rangeOfString:@"@"];
    if (retinaAtSymbol.location != NSNotFound) {
        [imageNameMutable insertString:@"-568h" atIndex:retinaAtSymbol.location];
    } else {
        // otherwise append -568h@2x, but only when running on a 4" Retina screen
        CGFloat screenHeight = [UIScreen mainScreen].bounds.size.height;
        if ([UIScreen mainScreen].scale == 2.f && screenHeight == 568.0f) {
            NSRange dot = [imageName rangeOfString:@"."];
            if (dot.location != NSNotFound) {
                [imageNameMutable insertString:@"-568h@2x" atIndex:dot.location];
            } else {
                [imageNameMutable appendString:@"-568h@2x"];
            }
        }
    }

    // load the -568h variant if it exists in the bundle, otherwise fall back to the original name;
    // note that calling retina4ImageNamed: here actually invokes the original imageNamed:
    // because the two implementations have been exchanged
    NSString *imagePath = [[NSBundle mainBundle] pathForResource:imageNameMutable ofType:@"png"];
    if (imagePath) {
        return [UIImage retina4ImageNamed:imageNameMutable];
    } else {
        return [UIImage retina4ImageNamed:imageName];
    }
}

@end

What this code does at initialization is swap Apple's implementation of "imageNamed:" with our "retina4ImageNamed:" (and vice versa). When the runtime calls "imageNamed:" it actually calls our method, which loads the image optimized for the 4" screen, provided that it exists and that we are running the app on a device with this screen (including the simulator). If the image is not present, or the screen is the traditional 3.5" one, the original implementation is called instead (renamed because of the initial exchange).
Obviously this implementation cannot be used when images are loaded explicitly with calls such as
[UIImage imageWithContentsOfFile:...]
where the file name is constructed explicitly.
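In those cases the 4" check has to be made by hand when building the file name; a minimal sketch, using an illustrative image name:

// Manually choose the 4" variant when loading an image by explicit path
// (the file names are illustrative).
NSString *baseName = @"TableViewBackground@2x";
if ([UIScreen mainScreen].bounds.size.height == 568.0f) {
    baseName = @"TableViewBackground-568h@2x";
}
NSString *path = [[NSBundle mainBundle] pathForResource:baseName ofType:@"png"];
UIImage *background = [UIImage imageWithContentsOfFile:path];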

Recently, in some apps, we have significantly reduced the size of the IPA file that we send to Apple.
For an app with a lot of graphic content we reduced by about 60% the size of the package sent to Apple, the so-called bundle.

When you develop a universal iPad/iPhone app and, above all, if you want to add Retina support for the new iPad 3, you can end up with a rather excessive package size.
Many apps are built around a multitude of full-screen textures, because many designers prefer to illustrate their ideas as richly as possible. On the iPad 3, at 2048 x 1536, these PNGs can be very heavy, and converting some of them to JPEG will save a lot of space.
It's a shame that JPEGs do not load as fast, but some PNGs can be more than 10 MB, while converted to JPEG they weigh about ~200 KB.
This is a good first step, but you have to be very careful to avoid damaging the quality.

The conversion from PNG to JPEG is not enough: after converting the largest textures, the weight of the IPA bundle can still be considerable.
Our goal is to stay almost always under 20 MB, so that the app can also be downloaded over the cellular network, without WiFi.

 

ImageOptim: just drag a single image, or even a group of images, into it and they will be compressed immediately without reducing quality. On average the size is reduced by about 15–35%.

Reading through different sites, I found some useful tips on the blog of Sam Soffes (http://samsoff.es/posts/image-optimization-on-ios); in his article he recommends the use of ImageOptim, a small program that optimizes images. With this Mac OS X app, files are processed so that they occupy less disk space and can be loaded more quickly, by finding the best compression parameters and removing unnecessary comments and color profiles. The software handles PNG, JPEG and animated GIF.
ImageOptim integrates various optimization tools: PNGOUT, AdvPNG, Pngcrush, extended OptiPNG, JpegOptim, jpegrescan, jpegtran and Gifsicle.
It is particularly suitable for publishing images on the web (it easily shrinks images "Saved for Web" in Photoshop) and it is useful for making Mac and iPhone/iPad applications lighter.

You can also convert many images to PNG8: in Photoshop, under File > Save for Web, you can export as PNG8 instead of PNG24 those images that would be fine as GIFs. Even though PNG8 does not support variable alpha, it is ideal for simple images.
Sam considers ImageOptim fantastic: it processes the images through a series of tools, squeezing them as much as possible without reducing quality. Even if you have "Saved for Web" all your images, ImageOptim is able to compress them by more than 50%; some files see a dramatic reduction, up to 90%.

Sam processed all the images through ImageOptim twice; the second time it was able to compress some images even further.
It is important to note that all of this is lossless compression.

 

It is possible to make the images even lighter with ImageAlpha (from the creators of ImageOptim), a tool that allows you to create PNG8 images with variable alpha.
This will save a lot of space. It is a more manual process than ImageOptim, but it works well for large images with alpha that do not have many colors.

Note: be sure to turn off Xcode's PNG optimization (the "Compress PNG Files" build setting), or you will undo all your hard work when you create your bundle.

Conclusion

We have heard mixed reviews about these tools: some people claim to have had problems with them, while others are fully satisfied.
In our experience they work quite well;
if you are trying to reduce the size of your app (bundle), then we recommend that you use them.

For the more expert:
if you want to try an even higher level of compression, you can try Scribd's fork of AdvanceCOMP by John Engelhart (the creator of JSONKit);
it is a little hardcore, but we would recommend that the geekier among you give it a try.

 

Normally developers do not have Photoshop, are not design gurus and, in general, certainly hate redoing their work to meet the needs of expert designers.
So we decided to share an infographic guide aimed at designers who want to make apps for iOS, to help them send the correct files to the developer.

The infographic explains everything with a single image, which we propose below:

 



First of all, as we can see in the image, we need to remember the screen sizes of the different Apple devices:

iPhone and iPod touch up to the third generation: 320 x 480 px / 163 PPI
iPhone 4 and iPod touch 4th generation: 640 x 960 px / 326 PPI
iPad and iPad 2: 1024 x 768 px / 132 PPI
iPad 3: 2048 x 1536 px / 264 PPI

Apple also recommends a minimum size of 44 x 44 px for icons used as control buttons.

 

Another suggestion: prepare the icons starting from 512 x 512 px, or 1024 x 1024 px for the iPad with Retina display, always taking care to review the design at the much smaller sizes as well.

The rest of the tips are much clearer if you carefully read what is reported in the image posted above.

Source: the infographic post, summed up in a single image, published on the FSM blog (http://www.funkyspacemonkey.com/ios-app-designers-guide-infographic)

 

When you decide to design an app, you should always follow the basic principles of industrial design.
Many people think about this when commissioning an app, but when they have to describe the application, and how their idea should translate into user experience and graphic interface (User Interface & User Experience), they are unprepared and very often hide behind phrases like "I don't know, this is a job for engineers, let's leave it to the technicians."

Needless to say, when the "technicians" get to work, these people, who had no clear concept to hand over, begin to demand substantial changes, giving advice and information of any kind, almost always only after the app has reached the final stage of its development.
It is a well-known fact that "technicians" and engineers first build the core of the application and then adapt the design to it; they do the opposite, in spite of themselves, only if the requirements are clear and convincing, especially when this is decided right from the beginning of the design.
With the "you do it, then we'll see" approach, favoured by distracted and ill-prepared professionals, the final aesthetic result can only be poor, whereas every engineer knows that, before starting to write code, you need clear UI principles together with a description of the functions related to the user experience.

Some sophists may criticize me for using the word "user", which is sometimes not very appealing if you consider that end users are just people, individuals. This difference in the meaning of the words is very clear to me, but for ease of communication, and especially for translation needs, I prefer to use the word "user" or "users" instead of "individual".

 

10 principles for the good design of an app, and of a product

First of all, quoting Steve Jobs, I propose the definition of design that has convinced me most:
"Design is the fundamental soul of a man-made creation that ends up expressing itself in successive outer layers of the product or service."

Of course Jobs himself was inspired by the principles of Dieter Rams, Braun's former head of design, who enumerated his 10 principles for the good design of a product:

 

Dieter Rams and his design products

 

  • Dieter Rams Ten Principles of “Good Design”
    1. Good Design Is Innovative : The possibilities for innovation are not, by any means, exhausted. Technological development is always offering new opportunities for innovative design. But innovative design always develops in tandem with innovative technology, and can never be an end in itself.
    2. Good Design Makes a Product Useful : A product is bought to be used. It has to satisfy certain criteria, not only functional but also psychological and aesthetic. Good design emphasizes the usefulness of a product while disregarding anything that could possibly detract from it.
    3. Good Design Is Aesthetic : The aesthetic quality of a product is integral to its usefulness because products are used every day and have an effect on people and their well-being. Only well-executed objects can be beautiful.
    4. Good Design Makes A Product Understandable : It clarifies the product’s structure. Better still, it can make the product clearly express its function by making use of the user’s intuition. At best, it is self-explanatory.
    5. Good Design Is Unobtrusive : Products fulfilling a purpose are like tools. They are neither decorative objects nor works of art. Their design should therefore be both neutral and restrained, to leave room for the user’s self-expression.
    6. Good Design Is Honest : It does not make a product more innovative, powerful or valuable than it really is. It does not attempt to manipulate the consumer with promises that cannot be kept.
    7. Good Design Is Long-lasting : It avoids being fashionable and therefore never appears antiquated. Unlike fashionable design, it lasts many years – even in today’s throwaway society.
    8. Good Design Is Thorough Down to the Last Detail : Nothing must be arbitrary or left to chance. Care and accuracy in the design process show respect towards the consumer.
    9. Good Design Is Environmentally Friendly : Design makes an important contribution to the preservation of the environment. It conserves resources and minimises physical and visual pollution throughout the lifecycle of the product.
    10. Good Design Is as Little Design as Possible : Less, but better – because it concentrates on the essential aspects, and the products are not burdened with non-essentials. Back to purity, back to simplicity.

 

Of course it is easy to see that these principles were conceived for the design of industrial products, but they also apply to the design of applications, especially if those applications will be used on products that were built precisely according to the principles of good industrial design, as all Apple products are.

Design better, work less

Dieter Rams, creator of the 10 principles, has always summed up his approach to design with the phrase "Weniger, aber besser", or "Less, but better".
Minimalism, as well as being very elegant, is certainly the best way to let all users instinctively understand the product and its functionality, and it makes the product itself, or the app, friendly to use (user friendly) and "pure".

Heuristic evaluation

At this point I can only describe the so-called heuristic evaluation.
Heuristic evaluation is an inspection method performed exclusively by usability experts; it makes it possible to evaluate whether a set of general design principles has been applied correctly in the UI.
The guidelines ("Ten Usability Heuristics") on which this kind of evaluation is based were developed in 1990 by Jakob Nielsen and Rolf Molich and were designed for desktop software, but these principles are still valid for touchscreen applications, such as iOS apps for iPhone and iPad, and apps for Android and Windows Mobile.

 

Heuristic evaluation thus measures the product's fidelity and adherence to the principles of usability, which you can find on Wikipedia (http://en.wikipedia.org/wiki/Usability).

This method, which as we said is a type of inspection, involves only usability experts and does not call on end users: for this reason it is easy to perform, cheap and fast, but it does not take into account the possible evolution of the needs of the public. Therefore, in my humble opinion, it is certainly very useful, but it has the limitation of being inflexible, and a lack of flexibility can usually stifle creative evolution.

The heuristic evaluation test therefore consists of a series of walkthroughs of the product, carried out separately by each "expert". During the test the software product is evaluated both for the static aspects of the interface, such as window layouts, labels, buttons, etc., and for the dynamic aspects of the interaction (logical processes and flows).
After finishing the investigation, the experts gather for a brainstorming session, check the results and compare them with the principles set out in the guidelines to reach common conclusions.

Conclusions

The heuristic evaluation method is certainly very useful and often necessary, but it can also be carried out instinctively, if the "expert" who reviews the app is an old hand in the business.

My doubt, when these methods are followed too rigidly, is that the evaluation can easily end up caged in a bureaucratic system, with its rules carved in stone, which severely limits creative people; as the very creator of the iPhone and iPad suggested, "Think Different".

Think Different has in fact always been the key to the success of every product in every sector.

Obviously, none of the great "Think Different" success stories has ever ignored the existence of principles like Nielsen's, which are among the cultural foundations of this industry.
We must never ignore the basics, but neither should we stay locked inside a few principles, however big and important they are, if we want to try to be innovative and revolutionary.

Increasingly, our customers and readers submit proposals for developing an app without using standard documents and without using the terminology that would help us understand properly how many and which features the app should have.

If you want to convey an idea that allows someone to create an app for iPhone, iPad or Android, it is necessary to prepare a clear mockup and/or functional document that contains all the detailed features of the app, together with a layout that makes the User Interface and User Experience understandable.


App Cooker – design, mock up and prototype app interfaces on iOS

If you want to use your iPad as a tool to prepare mockups and prototypes, we recommend App Cooker (website: http://www.appcooker.com/).

In the next article we will instead report on the tools that you can use on your Mac or PC.

Many people have almost entirely abandoned the use of computers and rely on the iPad to send emails, browse the Internet and use their tools;

App Cooker is an iPad app, and we consider it a great tool if you want to bring your ideas to a stage where they can actually be realized.

App Cooker is available on the App Store at a price of €15.99 (which, according to some rumors, will increase to $24 with the next version). The iPad application is developed by HotAppsFactory and, as we have said, is used to design iPhone and iPad applications.

The App Board collects your conceptual plans, mock-ups, icons, App Store description and pricing strategy.
It is the backbone of your project and lets you work in an organized and clear way.

Below is a video in English directly from the official site:

Define the ideas

We start with an idea, a sketch, and use this app to organize the ideas and get inspired.
The idea is the essence of any application, and it requires time and careful consideration.
App Cooker provides a dedicated tool for this, offering valuable advice drawn from Apple and other industry professionals.

 

iOS Mockups

The mockup engine supports orientation and simple links, and combines Apple's UI design elements with bitmaps, vector shapes, text and images.
Prototypes can come to life without a single line of code.

The icon of the app

The icon is the face of your application. Creating great icons requires experimentation and several attempts until the right solution is found. Using the freehand tool, or images combined with vector shapes, you can define the look of the icons for your ideas and see the results in various sizes in no time.

Pricing tool
App Cooker allows you to compare a large number of pricing scenarios to find the right model for your application. It supports both in-app purchases and advertising, which makes it easy to forecast revenues, costs and profits.

 

App Store descriptions

The description on the iTunes product page is a deciding factor for potential buyers. App Cooker makes writing this information a simple task and provides a place to localize it for any App Store, in 18 languages.

Mockups before code development

Designing a good application is difficult. You need creativity, talent, resources, knowledge, time and a strong sense of self-criticism; successful apps are the result of a long refinement process.

We have spent a long time, with the many different clients and companies who contact us, trying to explain the best way to design applications for iPhone, iPad and Android.

Over time you sooner or later discover that designing an application is much more than just graphic design.


 

In conclusion,

App Cooker is positioned as a professional application, since it allows you to design all the elements of an application compatible with all Apple iOS devices.

For all those who intend to develop their own application: you must first make the initial idea clear, along with its compatibility with the various devices and its various functions; then you must create the various graphic elements of the mockup, the icon, the App Store listing, and the deployment and earnings prospects.

Below we enunciate the 10 principles that I have collected from the site of the App cooker: