
robovm/apple-ios-samples


Project name:

robovm/apple-ios-samples

Repository URL:

https://github.com/robovm/apple-ios-samples

Primary language:

Objective-C 61.9%

Project description:

Mirror of Apple's iOS samples

This repository mirrors Apple's iOS samples.

Name Topic Framework Description
ABUIGroups Data Management
(Contact Data)
AddressBook ABUIGroups shows how to check and request access to a user’s address book database. It also demonstrates how to retrieve, add, and remove group records using AddressBook APIs. It displays groups organized by their source in the address book.
AccelerometerGraph Data Management
(Device Information)
UIKit The AccelerometerGraph sample application graphs the motion of the device. It demonstrates how to use the UIAccelerometer class and how to use Quartz2D and Core Animation to provide a high-performance graph view. It also demonstrates a low-pass filter that you can use to isolate the effects of gravity, and a high-pass filter that you can use to remove the effects of gravity.
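The low-pass/high-pass technique described above can be sketched in a few lines of C (a hedged illustration: the `lowpass`/`highpass` names and the smoothing factor are ours, not the sample's):

```c
#include <math.h>

/* Single-pole low-pass filter: the output drifts toward the input at a
   rate set by alpha (0 < alpha < 1), derived in the sample from the
   filter cutoff frequency and the accelerometer update interval.
   Slowly varying components such as gravity pass through; rapid
   motion is smoothed away. */
static double lowpass(double prev_out, double input, double alpha) {
    return prev_out + alpha * (input - prev_out);
}

/* High-pass counterpart: subtracting the low-pass gravity estimate
   from the raw sample leaves only user-generated acceleration. */
static double highpass(double input, double gravity_estimate) {
    return input - gravity_estimate;
}
```

Fed a constant 1 g input, the low-pass output converges toward 1 (the gravity estimate) while the high-pass output decays toward 0.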
Activity Rings: Contributing to Activity Rings on Apple Watch User Experience HealthKit This sample demonstrates the proper way to work with HealthKit on Apple Watch to contribute to the Activity Rings and have your app associated and listed with workouts and within the Move graph in the Activity app on iPhone.
AdaptivePhotos: Using UIKit Traits and Size Classes User Experience UIKit This sample demonstrates how to use UIKit APIs to make your app work great on all devices and in any orientation. You'll see how to use size classes, traits, and view controller additions to create an app that displays properly at any size (including iPhone 6 Plus) and configuration (including iPad Multitasking).
AddMusic Audio & Video
(Audio)
MediaPlayer AddMusic demonstrates basic use of iPod library access, part of the Media Player framework. You use iPod library access to play songs, audio books, and audio podcasts that are synced from a user's desktop iTunes library. This sample uses the Media Player framework's built-in user interface for choosing music.

AddMusic also demonstrates how to mix application audio with iPod library audio. The sample includes code for configuring application audio behavior using the AVAudioSession class and Audio Session Services.
AdvancedURLConnections Networking & Internet
(Protocol Streams)
Foundation This sample demonstrates various advanced networking techniques with NSURLConnection. Specifically, it demonstrates how to respond to authentication challenges, how to modify the default server trust evaluation (for example, to support a server with a self-signed certificate), and how to provide client identities.
AgentsCatalog: Using the Agents System in GameplayKit General GameplayKit Uses the Agents system in GameplayKit to let the characters in a game move themselves according to high-level goals. This sample demonstrates several of the individual goals that an agent can perform, such as moving to a target, avoiding obstacles, and following a predefined path. AgentsCatalog also shows how to tie multiple goals together to create more complex behaviors, such as making a group of agents seek a common target while moving together as a flock.
AirDrop Examples Networking & Internet "AirDropSample" demonstrates three use cases for incorporating AirDrop into an app.
AirLocate: Using CoreLocation to monitor, range, and configure your device as an iBeacon CoreLocation "AirLocate" demonstrates CoreLocation fencing and ranging of iBeacons, BTLE devices that aid iOS devices in determining a user's proximity to a location rather than their position. Obtaining a user's proximity with iBeacons is ideal in close-range settings such as indoors, where other positioning methods either do not work or do not give the level of accuracy an iBeacon can provide. In addition to demonstrating how to use CoreLocation's CLLocationManager APIs to monitor and range for these CLBeaconRegions, AirLocate also provides an example of how to calibrate and configure an iOS device as a beacon.
Alternate Views User Experience
(Windows & Views)
UIKit This sample demonstrates how to implement alternate or distinguishing views for particular device orientations. Doing so can be useful if your app displays different content between orientations, or if your app uses vastly different layouts between orientations that cannot be reconciled by auto layout or programmatic layout alone.
Application Icons and Launch Images for iOS General UIKit Every app is required to include an app icon. It is recommended that apps also provide icons for Spotlight and the Settings app, as well as an icon used when creating an Ad Hoc build and adding it to iTunes. See QA1686: App Icons on iPad and iPhone, for a complete listing of icons required for iPhone, iPad, and Universal apps: https://developer.apple.com/library/ios/qa/qa1686/_index.html.
ApplicationShortcuts: Using UIApplicationShortcutItem User Experience UIKit Demonstrates how to use the UIApplicationShortcutItem class to provide quick access to parts of your application directly from the device's home screen. The sample shows two static shortcuts (defined in the app's Info.plist), and two dynamic shortcuts (defined in code with the UIMutableApplicationShortcutItem class). The dynamic shortcuts can be edited to change the title, subtitle and icon.
AppPrefs: Storing and Retrieving User Preferences Data Management
(Preference Settings)
UIKit Demonstrates how to display your app's user configurable options (preferences) in the "Settings" system application. A settings bundle, included in your application’s bundle directory, contains the information needed by the Settings application to display your preferences and make it possible for the user to modify them. The Settings application saves any configured values in the defaults database so that your application can retrieve them at runtime.

This sample also shows how to launch the Settings app from your application and how to dynamically update your application's UI when its settings are changed while the app is in the background.
AQOfflineRenderTest Audio & Video
(Audio)
AudioToolbox Demonstrates using Audio Queue offline render functionality and the AudioQueueOfflineRender API. The sample produces LPCM output buffers from an ALAC encoded source which are then written to a .caf file. The output.caf file is then played back confirming the offline functionality worked as expected. All the code demonstrating the Audio Queue is in a single file called aqofflinerender.cpp.
AstroLayout: Building Adaptive UI with Auto Layout User Experience UIKit AstroLayout demonstrates how to properly activate and deactivate groups of constraints in response to a size class change. It also shows how to animate layout changes using UIView animations. You'll see how to use layout guides and anchors to reduce code overhead and allow for more complex layouts.
Audio Converter File Convert Test Audio & Video
(Audio)
AudioToolbox Demonstrates using the Audio Converter APIs to convert from a PCM audio format to a compressed format including AAC.
Audio Mixer (MixerHost) Audio & Video
(Audio)
AudioUnit MixerHost demonstrates how to use the Multichannel Mixer audio unit in an iOS application. It also demonstrates how to use a render callback function to provide audio to an audio unit input bus. In this sample, the audio delivered by the callback comes from two short loops read from disk. You could use a similar callback, however, to synthesize sounds to feed into a mixer unit. This sample is described in Audio Unit Hosting Guide for iOS.
Audio UI Sounds (SysSound) Audio & Video
(Audio)
AudioToolbox Demonstrates use of System Sound Services (AudioToolbox/AudioServices.h) to play alerts and user-interface sound effects, and to invoke vibration.
AudioUnitV3Example: A Basic AudioUnit Extension and Host Implementation Audio & Video
(Audio)
Demonstrates how to build a fully functioning example of an Audio Unit extension and Audio Unit host using version 3 of the Audio Unit APIs. The Audio Unit Extensions API introduces a mechanism for developers to deliver AudioUnits to users on iOS. The same API is available on both iOS and OS X, and provides a bridging mechanism for existing version 2 AudioUnits to coexist in existing AudioUnit host applications and in future version 3 hosts.
aurioTouch Audio & Video
(Audio)
AudioUnit aurioTouch demonstrates use of the Remote I/O audio unit for handling audio input and output. The application can display the input audio in one of three forms: a regular time-domain waveform, a frequency-domain waveform (computed by performing a fast Fourier transform on the incoming signal), or a sonogram view (a view displaying the frequency content of a signal over time, with color signaling relative power, the y axis being frequency, and the x axis time). Tap the sonogram button to switch to the sonogram view; tap anywhere on the screen to return to the oscilloscope. Tap the FFT button to perform and display the input data after an FFT transform. Pinch in the oscilloscope view to expand and contract the scale of the x axis.
AVARLDelegateDemo Audio & Video AVFoundation The sample code depicts three different use cases of AVAssetResourceLoaderDelegate (for Identity encryption use case scenarios) for HLS (HTTP Live streaming): - Redirect handler (redirection for the HTTP live streaming media files) - Fetching Encryption keys for the HTTP live streaming media (segments) - Custom play list generation (index file) for the HTTP live streaming.
AVCam-iOS: Using AVFoundation to Capture Images and Movies Audio & Video AVFoundation AVCam demonstrates how to use the AVFoundation capture API to record movies and capture still images. The sample has a record button for recording movies, a camera button for switching between front and back cameras (on supported devices), and a still button for capturing still images. AVCam runs only on an actual device, either an iPad or iPhone, and cannot be run in Simulator.
AVCamManual: Extending AVCam to Use Manual Capture API Audio & Video AVFoundation AVCamManual adds manual controls for focus, exposure, and white balance to the AVCam sample application.
AVCaptureAudioDataOutput To AudioUnit iOS Audio & Video
(Audio)
AVFoundation AVCaptureToAudioUnit for iOS demonstrates how to use the CMSampleBufferRefs vended by AVFoundation's capture AVCaptureAudioDataOutput object with various CoreAudio APIs. The application uses an AVCaptureSession with an AVCaptureAudioDataOutput to capture audio from the default input, applies an effect to that audio using a simple delay effect AudioUnit, and writes the modified audio to a file using the CoreAudio ExtAudioFile API. It also demonstrates using an AUGraph containing an AUConverter to convert the AVCaptureAudioDataOutput-provided data format into a format suitable for the delay effect.
AVCompositionDebugVieweriOS Audio & Video AVFoundation This sample application has an AVCompositionDebugView which presents a visual description of the underlying AVComposition, AVVideoComposition, and AVAudioMix objects that form a composition made from two clips, with a cross fade transition between them and audio ramps on the two audio tracks. The visualization provided by the sample can be used as a debugging tool to discover issues with an incorrect composition/video composition. For example, a break in the video composition would render black frames to the screen, which can easily be detected using the visualization in the sample.
AVCustomEdit Audio & Video AVFoundation The sample demonstrates the use of custom compositors to add transitions to an AVMutableComposition. It implements the AVVideoCompositing and AVVideoCompositionInstruction protocols to gain access to individual source frames, which are then rendered using OpenGL off-screen rendering.
AVFoundationPiPPlayer: Picture-in-Picture Playback with AVKit AVFoundation Demonstrates how to use the AVPictureInPictureController class to implement picture-in-picture video playback. It shows the steps required to start and stop picture-in-picture mode and to set up a delegate to receive event callbacks. Clients of AVFoundation using the AVPlayerLayer class for media playback must use the AVPictureInPictureController class, whereas clients of AVKit who use the AVPlayerViewController class get picture-in-picture mode without any additional setup.
AVFoundationQueuePlayer-iOS: Using a Mixture of Local File Based Assets and HTTP Live Streaming Assets with AVFoundation Audio & Video AVFoundation Demonstrates how to create a movie queue playback app using only the AVQueuePlayer and AVPlayerLayer classes of AVFoundation (not AVKit). You’ll find out how to manage a queue composed of local, HTTP-live-streamed, and progressive-download movies. You’ll also see how to implement play, pause, skip, volume adjustment, time slider updating, and scrubbing.
AVFoundationSimplePlayer-iOS: Using AVFoundation to Play Media Audio & Video AVFoundation Demonstrates how to create a simple movie playback app using only the AVPlayer and AVPlayerLayer classes from AVFoundation (not AVKit). You'll see how to open a movie file and then how to implement various functionality including play, pause, fast forward, rewind, volume adjustment, time slider updating, and scrubbing.
AVLoupe Audio & Video
(Video)
AVFoundation This sample demonstrates how to use multiple synchronized AVPlayerLayer instances, associated with a single AVPlayer, to efficiently produce non-trivial presentation of timed visual media. Using just one AVPlayer this sample demonstrates how you can display the same video in multiple AVPlayerLayers simultaneously. With minimal code you can create very customized and creative forms of video display. As an example, this sample demonstrates an interactive loupe, or magnifying glass, for video playback. This is similar to features that you might have used in iPhoto and Aperture.
AVMetadataRecordPlay: Timed Metadata Capture Recording and Playback Audio & Video AVFoundation The AVMetadataRecordPlay sample demonstrates how to use AVFoundation capture APIs to record and play movies with timed metadata content. The sample also shows how to use timed-metadata tracks to record detected-face, video orientation, and GPS metadata. When playing back content, the AVMetadataRecordPlay class reads the detected-face and GPS timed metadata tracks and uses them to render augmentations on the video layer indicating their values. AVMetadataRecordPlay also reads the video orientation metadata and dynamically adjusts the video layer to properly render the content. The sample runs only on an actual device (iPad or iPhone); you can’t run it in the Simulator.
AVMovieExporter Audio & Video
(Video)
AVFoundation This universal sample application reads movie files from the Asset Library and Media Library, then exports them to a new media file using user-defined settings. The user can adjust several aspects of the exported file.
AVPlayerDemo Audio & Video
(Video)
AVFoundation Uses AVPlayer to play videos from the iPod Library, Camera Roll, or via iTunes File Sharing. Also displays metadata.
AVReaderWriter: Offline Audio / Video Processing AVFoundation This sample demonstrates how to use AVAssetReader and AVAssetWriter to perform offline (i.e. non-real-time) processing of video and audio.
AVSimpleEditoriOS Audio & Video AVFoundation AVSimpleEditor is a simple AVFoundation based movie editing application which exercises the APIs of AVVideoComposition, AVAudioMix and demonstrates how they can be used for simple video editing tasks. It also demonstrates how they interact with playback (AVPlayerItem) and export (AVAssetExportSession). The application performs trim, rotate, crop, add music, add watermark and export. This sample is ARC-enabled.
AVTimedAnnotationWriter: Using Custom Annotation Metadata for Movie Writing and Playback AVFoundation Demonstrates how to use the AVAssetWriterInputMetadataAdaptor API to write circle annotation metadata during video playback. The captured movie file has video, audio, and metadata tracks. The metadata track contains circle annotations, which are vended during playback using AVPlayerItemMetadataOutput.
avTouch Audio & Video
(Audio)
AVFoundation The avTouch sample demonstrates use of the AVAudioPlayer class for basic audio playback.
Bananas: Building a Game with SceneKit Graphics & Animation
(3D Drawing)
SceneKit This sample shows how to build a basic game using Scene Kit, demonstrating physics, rendering techniques, lighting, actions and animation.
Blurring and Tinting an Image Graphics & Animation UIKit UIImageEffects demonstrates how to create and apply blur and tint effects to an image using the vImage, Quartz, and UIKit frameworks. The vImage framework is suited for high-performance image processing. Using vImage, your app gets all the benefits of vector processing without the need for you to write vectorized code.
BonjourWeb Networking & Internet
(Services & Discovery)
Foundation This application illustrates the fundamentals of browsing for network services using Bonjour. The BonjourBrowser hierarchically displays Bonjour domains and services as table views in a navigation controller. The contents of the table views are discovered and updated dynamically using NSNetServiceBrowser objects. Tapping an item in the services table causes the corresponding NSNetService object to be resolved asynchronously. When that resolution completes, a delegate method is called which constructs a URL and opens it in Safari.
BracketStripes: Using the Bracketed Capture API AVFoundation

This sample illustrates the use of still image bracketing APIs available in AVFoundation.

Two types of brackets are demonstrated:

1. Auto-exposure brackets with exposure target bias, and

2. Manual exposure with control over ISO and exposure duration.

As the bracketed frames are captured in real time, they are "striped" into a destination image buffer and later shown in a modal image viewer so the captured frames can be compared side by side.
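The "striping" step can be sketched as follows (a minimal illustration with hypothetical names; the sample's real code operates on captured pixel buffers rather than plain byte arrays): each bracketed frame contributes alternating vertical stripes to a shared destination image.

```c
/* Stripe n same-sized single-channel frames into one destination
   buffer: column x of the output is copied from frame
   (x / stripe_w) % n, so adjacent stripes come from successive
   bracketed exposures and can be compared side by side. */
static void stripe(const unsigned char *frames[], int n,
                   int w, int h, int stripe_w, unsigned char *dst) {
    for (int y = 0; y < h; y++) {
        for (int x = 0; x < w; x++) {
            int f = (x / stripe_w) % n;
            dst[y * w + x] = frames[f][y * w + x];
        }
    }
}
```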
Breadcrumb User Experience MapKit Demonstrates how to draw a path using the Map Kit overlay, MKOverlayView, that follows and tracks the user's current location. The included CrumbPath and CrumbPathView overlay and overlay view classes can be used for any path of points that are expected to change over time. It also demonstrates what is needed to track the user's location as a background process.
BTLE Central Peripheral Transfer CoreBluetooth This sample shows how to transfer data from an iOS device in CoreBluetooth Peripheral Mode to another in Central Mode, by using a CBCharacteristic on the Peripheral side that changes its value. The value change is automatically picked up on the Central side.
Checking and Requesting Access to Data Classes in Privacy Settings Security PrivacyPrompts shows how to check and request access to data classes such as Location, Contacts, and social media in Privacy Settings on iOS.
CircleLayout User Experience
(Windows & Views)
UIKit Shows how to use a collection view to arrange views on a circle.
CloudCaptions: How to integrate CloudKit into your application CloudKit This sample shows how to use CloudKit to upload and retrieve CKRecords and associated assets. In this example, there are two record types: an image record type and a post record type. Users are able to upload their own photos or select an image already found in an image record type. This example also uses an NSPredicate in its CKQueries to filter results based on tags.
CloudKit Catalog: An Introduction to CloudKit (Cocoa and JavaScript) CloudKit CloudKit Catalog will get you up to speed quickly on CloudKit using either the CloudKit API for JavaScript or the CloudKit API for iOS. The JavaScript portion of this sample code project is also available as a hosted version at https://cdn.apple-cloudkit.com/cloudkit-catalog/ .
CloudPhotos: Using CloudKit with iOS CloudPhotos is a clear and concise CloudKit sample showing how to share photos among users. Photos are represented as CKRecords, holding the photo (CKAsset), its title (NSString), the creation date (NSDate), and the location (CLLocation) where it was created. The sample displays all photos found in the public CloudKit container. Users can view photo records, but only the owner of a photo record can change or delete it. The attributes that can be edited are the photo itself and its title; the creation date and location data are read-only. Users add photo records from their photo library or camera roll.
CloudSearch Data Management
(File Management)
Foundation Demonstrates how to find documents in iCloud using NSMetadataQuery. Included as part of this sample is a class called "CloudDocumentsController" which runs Spotlight queries, using NSMetadataQuery, to discover files found in iCloud. You can use this class to quickly gain access to those available files.
Collection View Transition User Experience
(Windows & Views)
UIKit This sample illustrates how to create a custom transition when navigating between two collection views in a navigation hierarchy managed by a navigation controller. The transition can be interrupted and reversed. It uses a subclass of UICollectionViewTransitionLayout to help in the transition of the cell positions based on gesture position.
CollectionView-Simple User Experience
(Windows & Views)
UIKit Demonstrates how to use UICollectionView, a way to present ordered data to users in a grid-like fashion. With a collection view object, you are able to define the presentation and arrangement of embedded views. The collection view class works closely with an accompanying layout object to define the placement of individual data items. In this example UIKit provides a standard flow-based layout object that you can use to implement multi-column grids containing items of a standard size.
Core Audio Utility Classes Audio & Video
(Audio)
CoreAudio The "CoreAudio" folder contains the Public Utility sources (PublicUtility folder) as well as base classes required for codec and audio unit development. These utility classes are used by various Apple Core Audio sample projects and extend or wrap Core Audio APIs.
Core Data Transformable Attributes Data Management CoreData This sample illustrates a Core Data application that uses more than one entity and uses transformable attributes. It also shows inferred migration of the persistent store.
Core Image Filters with Photos and Video for iOS Graphics & Animation
(2D Drawing)
CoreImage The CIFunHouse project shows how to apply Core Image built-in and custom CIFilters to photos and video. The application presents view controllers for adding photo and video sources, choosing CIFilters from a list, and making live adjustments to filter parameters. The project also contains code for custom CIFilter subclasses for effects such as Sobel edge detection, old-style film, and fake depth-of-field looks. The code also demonstrates how to save a filtered video stream to the ALAssetsLibrary while simultaneously previewing the video on the display.
CoreBluetooth Temperature Sensor CoreBluetooth A simple iOS iPhone application that demonstrates how to use the CoreBluetooth Framework to connect to a Bluetooth LE peripheral and read, write and be notified of changes to the characteristics of the peripheral.
CoreDataBooks Data Management CoreData This sample illustrates a number of aspects of working with the Core Data framework in an iOS application.
CoreTextPageViewer User Experience
(Windows & Views)
CoreText This sample shows how to use Core Text to display large bodies of text, text with mixed styles, and text with special style or layout requirements, such as use of custom fonts. A version of this sample was used in the "Advanced Text Handling for iPhone OS" WWDC 2010 Session.
CryptoExercise Security Security This sample demonstrates the use of the two main cryptographic API sets in the iPhone OS SDK. Asymmetric key encryption and random nonce generation are handled through the Security framework API set, whereas symmetric key encryption and digest generation are handled by the CommonCrypto API set. The CryptoExercise sample brings both of these APIs together through a network service, discoverable via Bonjour, that performs a "dummy" cryptographic protocol between devices found on the same subnet.
CurrentAddress User Experience MapKit This sample makes use of the CLGeocoder class, which provides services for converting a map coordinate (specified as a latitude/longitude pair) into information about that coordinate, such as the country, city, or street. A reverse geocoder object is a single-shot object that works with a network-based map service to look up placemark information for its specified coordinate value. To store this placemark information, the sample leverages the MKPlacemark class.
Custom Animatable Property Graphics & Animation
(Animation)
CoreGraphics Shows how to leverage Core Animation’s timing and rendering callbacks to implement custom animatable properties for CALayer subclasses. This technique is supported whether your CALayer subclass belongs to a UIView or is standalone. Both explicit and implicit animation triggers are demonstrated, as well as basic and keyframe animation types.
Custom Section Titles with NSFetchedResultsController Data Management CoreData "DateSectionTitles" shows how to create section information for NSFetchedResultsController using dates.
Custom View Controller Presentations and Transitions User Experience
(Windows & Views)
UIKit Custom View Controller Presentations and Transitions demonstrates using the view controller transitioning APIs to implement your own view controller presentations and transitions. Learn from a collection of easy to understand examples how to use UIViewControllerAnimatedTransitioning, UIViewControllerInteractiveTransitioning, and UIPresentationController to create unique presentation styles that adapt to the available screen space.
CustomContentAccessibility User Experience UIKit This sample, previously known as WWDCMaps, shows how to support accessibility in a custom-drawn UIView and UIControl, demonstrates how to create an accessibility element for each map item, and implements the UIAccessibilityContainer protocol in the container view to interact with the iOS accessibility system. The Guided Access Restriction API, newly introduced in iOS 7 for restricting functions when Guided Access is enabled, is also demonstrated in this sample.
CustomHTTPProtocol Networking & Internet Foundation CustomHTTPProtocol shows how to use an NSURLProtocol subclass to intercept the NSURLConnections made by a high-level subsystem that does not otherwise expose its network connections. In this specific case, it intercepts the HTTPS requests made by a web view and overrides server trust evaluation, allowing you to browse a site whose certificate is not trusted by default.
Customizing UINavigationBar User Experience
(Controls)
UIKit NavBar demonstrates using the UINavigationController and UIViewController classes together as building blocks of your application's user interface. Use it as a reference when starting development of a new application. The various pages in this sample exhibit different ways to modify the navigation bar: directly, through the appearance proxy, and by modifying the view controller's UINavigationItem. Among the levels of customization are varying appearance styles and applying custom left and right buttons, known as UIBarButtonItems.
DateCell User Experience
(Tables)
UIKit Demonstrates formatted display of date objects in table cells and use of UIDatePicker to edit those values.
DemoBots: Building a Cross Platform Game with SpriteKit and GameplayKit Graphics & Animation SpriteKit DemoBots is a fully-featured 2D game built with SpriteKit and GameplayKit, and written in Swift 2.0. It demonstrates how to use agents, goals, and behaviors to drive the movement of characters in your game, and how to use rule systems and state machines to provide those characters with intelligent behavior. You'll see how to integrate on-demand resources into a game to optimize resource usage and reduce the time needed to download additional levels. DemoBots takes advantage of the Xcode 7 scene and actions editor to create detailed levels and animations. The sample also contains assets tailored to ensure the best experience on every supported device.
DocInteraction Data Management
(File Management)
UIKit Demonstrates how to use UIDocumentInteractionController to obtain information about documents and how to preview them. There are two ways to preview documents: one is to use UIDocumentInteractionController's preview API; the other is to use QLPreviewController directly. This sample also demonstrates use of the UIFileSharingEnabled feature, so you can upload documents to the application using iTunes and then preview them. With the help of "kqueue" kernel event notifications, the sample monitors the contents of the Documents folder.
DownloadFont CoreText Demonstrates how to download fonts on demand on iOS 6 and later.
EADemo Data Management
(Device Information)
ExternalAccessory The sample can be used with any Made For iPod (MFI) device designed for use with the External Accessory framework. The application displays an attached External Accessory device in the Accessories window, shows information registered by the MFI device, and provides methods to send data to and receive data from the device.
EKReminderSuite EKReminderSuite is a set of sample code that demonstrates how to implement reminders using the EventKit Framework.
Emporium: A Simple Shopping Experience with Apple Pay PassKit This sample shows how to integrate Apple Pay into a simple shopping experience. You'll learn how to make payment requests, collect shipping and contact information, apply discounts for debit/credit cards, and use the Apple Pay button. This project also contains an Apple Watch WatchKit extension that shows you how to start Apple Pay transactions using Handoff with the NSUserActivity class.
Enumeration Sample Foundation EnumerationSample is a command line project that demonstrates how to implement a class that supports block-based enumeration, fast enumeration, enumeration using NSEnumerator, and subscripting. While provided as an OS X application, the techniques demonstrated by this sample are fully applicable to iOS development.
Example app using Photos framework User Experience Photos A basic Photos-like app which introduces the Photos framework.

- List albums, folders, and moments.

- Display the contents of moments or albums.

- Display the content of:

* A single photo.

* A video; allowing playback.

* A Live Photo; allowing playback via 3D touch gestures or manually via a button.

- Allow the following actions:

* Single-click editing of a photo.

* Creating an album and adding assets to it.

* Re-ordering assets in an album.

* Removing assets from an album.

* Deleting assets and albums.

* Hiding and unhiding assets from moments.

* Favoriting assets.

* Initiating playback of a Live Photo.
Extended Audio File Conversion Test Audio & Video
(Audio)
AudioToolbox Demonstrates using ExtAudioFile API to convert from one audio format and file type to another.
Fit: Store and Retrieve HealthKit Data HealthKit Fit is a sample intended as a quick introduction to HealthKit. It teaches you everything from writing data into HealthKit to reading data from HealthKit. This information may have been entered into the store by some other app; e.g. a user's birthday may have been entered into Health, and a user's weight by some popular weight-tracking app. Fit shows examples of using queries to retrieve information from HealthKit using sample queries and statistics queries. Fit also gives you a quick introduction to using the new Foundation classes NSLengthFormatter, NSMassFormatter, and NSEnergyFormatter.
Footprint: Indoor Positioning with Core Location CoreLocation Use Core Location to take a Latitude/Longitude position and project it onto a flat floorplan. Demonstrates how to do the conversions between a Geographic coordinate system (Latitude/Longitude), a floorplan PDF coordinate system (x, y), and MapKit.
FourInARow: Using the GameplayKit Minmax Strategist for Opponent AI General GameplayKit This sample demonstrates how to use the GKMinmaxStrategist class to implement a computer-controlled opponent for a simple board game. You'll see how to structure gameplay model code for use with the minmax strategist using the GKGameModel protocol and related APIs.
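The minmax evaluation this sample is built around can be illustrated with a tiny self-contained sketch. This is plain Python rather than GameplayKit's GKMinmaxStrategist, and it scores a one-pile stick game rather than Four-in-a-Row; all names here are illustrative:

```python
# Minimal minimax sketch: one-pile Nim, take 1 or 2 sticks per turn,
# taking the last stick wins. GKMinmaxStrategist scores game-model
# states the same way, by recursively evaluating each legal move.
def minimax(sticks, maximizing):
    """Score a position: +1 if the maximizer wins with best play, -1 otherwise."""
    if sticks == 0:
        # The previous player took the last stick and won.
        return -1 if maximizing else 1
    scores = [minimax(sticks - take, not maximizing)
              for take in (1, 2) if take <= sticks]
    return max(scores) if maximizing else min(scores)

def best_move(sticks):
    """Return how many sticks the AI should take from the current pile."""
    return max((take for take in (1, 2) if take <= sticks),
               key=lambda take: minimax(sticks - take, False))
```

With 4 sticks the AI takes 1, leaving the opponent the losing 3-stick position; multiples of 3 are lost for the player to move.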
Fox: Building a SceneKit Game with the Xcode Scene Editor General SceneKit This sample demonstrates how to use the Xcode Scene Editor to build a level in a SceneKit-based game. You’ll see how to choose between the Metal and OpenGL ES renderer, add positional audio triggers, and set up light maps using material properties. This sample supports tvOS and Game Controllers.
GenericKeychain Security Security This sample shows how to add, query for, remove, and update a keychain item of generic class type. Also demonstrates the use of shared keychain items. All classes exhibit very similar behavior so the included examples will scale to the other classes of Keychain Item: Internet Password, Certificate, Key, and Identity.
GeocoderDemo: Uses CLGeocoder for forward and reverse geocoding Data Management CoreLocation This sample application demonstrates using a CLGeocoder instance to perform forward and reverse geocoding on strings and dictionaries. The application also includes an example distance calculator that will display the distance between two placemarks.
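The distance calculator mentioned in this sample can be approximated outside Core Location with a plain haversine great-circle formula. This is a sketch, not the sample's code (the sample would use CLLocation's distanceFromLocation:, which applies a more precise geodesic model); the function name and radius constant here are assumptions:

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius; an approximation

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points (degrees)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))
```

For example, San Francisco to Los Angeles comes out near the expected ~560 km.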
Get Battery Status Data Management
(Device Information)
UIKit Demonstrates the use of the battery status properties and notifications provided via the iOS SDK.
GKAchievements General GameCenter Provides an example of how to successfully submit achievements and store them for resubmission when submission fails.
GKAuthentication General GameCenter An example of how to successfully authenticate using GameKit.
GKLeaderboards General GameCenter GKLeaderboards is a sample application that shows how to correctly submit scores and view them using GKLeaderboardViewController.
GKTapper GameCenter GKTapper is a sample application that shows how to support GameCenter Leaderboards and Achievements. It also demonstrates using GKLeaderboardViewController and GKAchievementViewController to display this data.
GLAirplay User Experience OpenGLES Demonstrates how to provide a richer experience to your users when they are using AirPlay by displaying your UI on the iPhone/iPad and your app/game content on the second display.
GLCameraRipple Audio & Video
(Video)
AVFoundation This sample demonstrates how to use the AVFoundation framework to capture YUV frames from the camera and process them using shaders in OpenGL ES 2.0. CVOpenGLESTextureCache, which is new to iOS 5.0, is used to provide optimal performance when using the AVCaptureOutput as an OpenGL texture. In addition, a ripple effect is applied by modifying the texture coordinates of a densely tessellated quad.
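The ripple effect this sample produces by displacing texture coordinates follows the classic two-buffer wave-propagation scheme. Below is a minimal sketch of one simulation step in illustrative Python, not the sample's actual GLSL/Objective-C code; the damping constant and function name are assumptions:

```python
# One step of a grid ripple simulation: each interior cell's next height
# is half the sum of its four neighbours minus its previous height,
# scaled by a damping factor so waves die out over time.
def ripple_step(curr, prev, damping=0.95):
    h, w = len(curr), len(curr[0])
    nxt = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            neighbours = (curr[y - 1][x] + curr[y + 1][x]
                          + curr[y][x - 1] + curr[y][x + 1])
            nxt[y][x] = (neighbours / 2.0 - prev[y][x]) * damping
    return nxt
```

A single point disturbance spreads outward to its neighbours on the next step; in the sample, the resulting height field perturbs the texture coordinates of the tessellated quad.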
GLEssentials Graphics & Animation
(3D Drawing)
OpenGLES This sample provides an example of some essential OpenGL functionality. There are usages of Vertex Buffer Objects (VBOs), Vertex Array Objects (VAOs), Framebuffer Objects (FBOs), and GLSL Program Objects. It creates a VAO and VBOs from loaded model data, a texture for the model from loaded image data, and GLSL shaders from loaded source. It also creates an FBO and texture to render a reflection of the model, and uses an environment-mapping GLSL program to apply the reflection texture to a plane. This sample also demonstrates sharing of OpenGL source code between iOS and OS X. Additionally, it implements fullscreen rendering, supports retina displays, and demonstrates how to obtain and use an OpenGL Core Profile rendering context on OS X.
GLGravity Graphics & Animation
(3D Drawing)
OpenGLES The GLGravity sample application demonstrates how to use the UIAccelerometer class in combination with OpenGL rendering. It shows how to extract the gravity vector from the accelerometer values using a basic low-pass filter, and how to build an OpenGL transformation matrix from it.
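The low-pass filtering step this sample describes is simple enough to sketch on its own. This is illustrative Python rather than the sample's Objective-C, and the smoothing constant is an assumption:

```python
# Low-pass filter for accelerometer data: blend each new reading with
# the previous filtered value, so slow-changing gravity passes through
# while fast user-generated motion is damped out.
ALPHA = 0.1  # smoothing factor; smaller values filter more aggressively

def low_pass(samples, alpha=ALPHA):
    """Return the gravity estimate after feeding in a list of (x, y, z) samples."""
    gravity = samples[0]
    for sample in samples[1:]:
        gravity = [alpha * s + (1.0 - alpha) * g
                   for s, g in zip(sample, gravity)]
    return gravity
```

The resulting gravity vector is what the sample turns into an OpenGL transformation matrix.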
GLImageProcessing Graphics & Animation
(3D Drawing)
OpenGLES The GLImageProcessing sample application demonstrates how to implement simple image processing filters (Brightness, Contrast, Saturation, Hue rotation, Sharpness) using OpenGL ES1.1. The sample also shows how to create simple procedural button icons using CoreGraphics.
GLPaint Graphics & Animation
(3D Drawing)
OpenGLES The GLPaint sample application demonstrates how to support single finger painting using OpenGL ES. This sample also shows how to detect a "shake" motion of the device. By looking at the code you'll see how to set up an OpenGL ES view and use it for rendering painting strokes. The application creates a brush texture from an image by first drawing the image into a Core Graphics bitmap context. It then uses the bitmap data for the texture.
GLTextureAtlas Graphics & Animation
(3D Drawing)
OpenGLES This sample demonstrates how to use a texture atlas to draw multiple objects with different textures simultaneously using OpenGL ES. The application uses a texture atlas in the PVR format. By adding degenerate triangles and computing the 3D transformations ourselves using matrices, all of the draw calls can be collapsed into one.
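The degenerate-triangle trick mentioned here can be sketched as a pure index-buffer transform. This is illustrative Python, not the sample's code, and it ignores the winding-order fix-ups a production renderer would add for odd-length strips:

```python
# Merge several triangle-strip index lists into one draw call by
# repeating the last index of one strip and the first index of the
# next: the bridging triangles have zero area, so the GPU skips them.
def join_strips(strips):
    """Concatenate triangle-strip index lists with degenerate bridges."""
    joined = []
    for strip in strips:
        if joined:
            joined.append(joined[-1])  # repeat last index of previous strip
            joined.append(strip[0])    # repeat first index of next strip
        joined.extend(strip)
    return joined
```

For example, strips `[0, 1, 2]` and `[3, 4, 5]` merge into `[0, 1, 2, 2, 3, 3, 4, 5]`, drawable with a single glDrawElements call.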
Handling Touches Using Responder Methods and Gesture Recognizers Data Management
(Event Handling)
UIKit This sample contains two applications that demonstrate how to handle touches, including multiple touches that move multiple objects: "Touches_Responder" demonstrates how to handle touches using the UIResponder methods touchesBegan:withEvent:, touchesMoved:withEvent:, and touchesEnded:withEvent:. "Touches_GestureRecognizers" demonstrates how to use UIGestureRecognizer objects to handle touch events.
HazardMap MapKit Demonstrates how to create a custom Map Kit overlay to display USGS earthquake hazard data. It shows how to create a custom Map Kit overlay and corresponding view to display USGS earthquake hazard data on top of an MKMapView.
HeadsUpUI User Experience
(Windows & Views)
UIKit Demonstrates how to implement a Heads Up or HUD-like user interface over the app's primary view controller. This essentially mimics the behavior of the MPMoviePlayerController's hovering controls for controlling movie playback. Developers can refer to this sample for best practices in how to implement this translucent kind of interface complete with animation and timer support.
HelloGoodbye: Using the Accessibility API to Widen Your User Base UIKit This project shows you how to use the Accessibility API to widen your user base. It demonstrates how you can adjust your user interface when a user has Bold Text, Reduce Transparency, Darken Colors, or Reduce Motion enabled. It also contains examples of API you can implement to allow a VoiceOver or Switch Control user to interact with your app.
HomeKit Catalog: Creating Homes, Pairing and Controlling Accessories, and Setting Up Triggers General HomeKit HomeKit Catalog demonstrates how to use the HomeKit framework. Use this project as a reference for interacting with objects and performing common tasks such as creating homes, pairing with and controlling accessories, and setting up triggers to automate actions.
iAdInterstitialSuite iAd iAdInterstitialSuite contains two applications that demonstrate the usage of the ADInterstitialAd introduced in iOS 4.3.
iAdSuite with Storyboards User Experience iAd iAdSuite is a set of samples demonstrating how to manage an ADBannerView in many common scenarios, each scenario demonstrated in a particular sample application.
Inter-App Audio Examples Audio & Video
(Audio)
AudioUnit This suite of samples includes three projects that together illustrate the Inter-App Audio feature.
Internationalization and Localization for iOS Data Management
(Strings, Text, & Fonts)
UIKit Drawing from the existing Cocoa Internationalization Mountains sample, this sample shows how to integrate, design, and programmatically access localized resources and data in an iOS application. This sample uses multiple localized views, localized formatted strings, localized application data, localized Info.plist strings, and a localized application preferences settings bundle. The sample is localized in three languages: English, French, and Traditional Chinese.
iPhoneCoreDataRecipes Data Management CoreData This sample shows how you can use view controllers, table views, and Core Data in an iPhone application.
KeyboardAccessory User Experience UIKit Shows how to use a keyboard accessory view.
KeychainTouchID: Using Touch ID with Keychain and
