Kyle Howells
iOS Developer since iOS 3 🏂🧗🏻‍♂️📷
101,327 Tweets · 251 Following · 4,662 Followers
Kyle Howells 20h
Replying to @Freerunnering
I wrote the example project as the companion project for a blog post, but I'll do the writeup itself and the ObjC example project another time.
Kyle Howells 20h
Reading and writing image metadata with CoreGraphics is actually really simple. 2 lines of code to read, and 5 to write. I built a simple example app to allow viewing and editing all the metadata.
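A minimal sketch of the CoreGraphics/ImageIO calls the tweet is counting; the file paths and the edited EXIF field are illustrative, not taken from the app:

```swift
import Foundation
import ImageIO

let inputURL  = URL(fileURLWithPath: "/tmp/in.jpg")   // hypothetical paths
let outputURL = URL(fileURLWithPath: "/tmp/out.jpg")

// Reading: two calls get you the full metadata dictionary.
if let source = CGImageSourceCreateWithURL(inputURL as CFURL, nil),
   var metadata = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any] {
    print(metadata[kCGImagePropertyExifDictionary] ?? "no EXIF block")

    // Writing: re-encode the original image with the edited dictionary.
    metadata[kCGImagePropertyExifDictionary] = [kCGImagePropertyExifUserComment: "edited"]
    if let type = CGImageSourceGetType(source),
       let destination = CGImageDestinationCreateWithURL(outputURL as CFURL, type, 1, nil) {
        CGImageDestinationAddImageFromSource(destination, source, 0, metadata as CFDictionary)
        CGImageDestinationFinalize(destination)
    }
}
```

Passing the dictionary to `CGImageDestinationAddImageFromSource` re-encodes the image with the new properties, which is where the "5 to write" comes from.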
Kyle Howells 22h
The advantage of not using external dependencies. I'm targeting a roughly 0.5-2MB install size with customised UI support for iPhone, iPad and macOS. I'm not really sure why I would, but in theory I could fit my entire app into an App Clip almost 10 times.
Kyle Howells 22h
Replying to @MarkVillacampa
Just curious, what format are those 360s in? At the moment Spherium displays equirectangular images, but doesn't convert fisheye images directly from the camera. Though I'd like to explore that in the future, if nothing else so I'm personally not tied to the vendors' processing apps.
Kyle Howells 22h
Replying to @MarkVillacampa
At the moment I'm getting away with using UIKit and SceneKit. For zooming arbitrarily into arbitrarily large images and taking arbitrary snapshots of those images, I'd probably need to write my own custom Metal renderer and image loading code.
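For context, the UIKit + SceneKit approach generally amounts to texturing the inside of a sphere; a minimal sketch (names and values are illustrative, not Spherium's actual code):

```swift
import UIKit
import SceneKit

// Map an equirectangular image onto the inside of a sphere and view it
// from the center; SCNView's allowsCameraControl then gives free look-around.
func makePanoramaScene(image: UIImage) -> SCNScene {
    let scene = SCNScene()

    let sphere = SCNSphere(radius: 10)
    sphere.firstMaterial?.diffuse.contents = image
    sphere.firstMaterial?.isDoubleSided = true // render the inside faces
    // A real implementation would also mirror the texture horizontally so
    // the image reads correctly when seen from inside the sphere.
    scene.rootNode.addChildNode(SCNNode(geometry: sphere))

    let cameraNode = SCNNode()
    cameraNode.camera = SCNCamera()
    scene.rootNode.addChildNode(cameraNode) // camera sits at the sphere's center
    return scene
}
```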
Kyle Howells 22h
Replying to @MarkVillacampa
Part one: just downscaling and loading it without crashing (which I know how to do). Part two: progressively loading in higher-res versions of parts of the image as you zoom in. That part will require more research than is probably worth investing in this app.
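For what "part two" might look like, one common UIKit technique (my assumption, not what Spherium does) is a CATiledLayer-backed view, which requests higher-resolution tiles as the zoom level increases:

```swift
import UIKit

// Sketch of a CATiledLayer-backed view: UIKit calls draw(_:) once per
// visible tile, at the level of detail matching the current zoom.
final class TiledImageView: UIView {
    override class var layerClass: AnyClass { CATiledLayer.self }

    override init(frame: CGRect) {
        super.init(frame: frame)
        guard let tiled = layer as? CATiledLayer else { return }
        tiled.levelsOfDetail = 4        // zoomed-out levels
        tiled.levelsOfDetailBias = 4    // extra levels when zooming in
        tiled.tileSize = CGSize(width: 512, height: 512)
    }
    required init?(coder: NSCoder) { fatalError() }

    override func draw(_ rect: CGRect) {
        guard let ctx = UIGraphicsGetCurrentContext() else { return }
        let scale = ctx.ctm.a / UIScreen.main.scale // current detail level
        // A real implementation would fetch the pre-cut tile covering
        // `rect` at `scale` (e.g. from disk) and draw it here.
        _ = scale
    }
}
```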
Kyle Howells 22h
Replying to @MarkVillacampa
One of the first people to try the app managed to load the 555MB, 43200 × 21600 BlueEarth NASA 360 onto their iPad. Needless to say, that crashed the app. Though I do plan to, and know how I can, support that in the future. Even then it'll be two-stage.
Kyle Howells 22h
Replying to @MarkVillacampa
Which is just over 8000px wide for older iOS devices, and just over 16k for Macs and newer iOS devices. Anything larger gets downscaled. Anything REALLY large will probably crash the app.
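A minimal sketch of that downscale-on-load step using ImageIO's thumbnail API; the GPU-family check and the mapping to the 8192/16384 limits are my assumption of how you'd pick the threshold:

```swift
import Foundation
import ImageIO
import Metal

// Downscale on load so the longest side fits the GPU's texture limit
// (8192 on older A-series chips, 16384 on newer devices and Macs).
func loadGPUSafeImage(at url: URL) -> CGImage? {
    let device = MTLCreateSystemDefaultDevice()
    let maxSide = (device?.supportsFamily(.apple3) == true) ? 16384 : 8192

    let options: [CFString: Any] = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,
        kCGImageSourceCreateThumbnailWithTransform: true, // honour EXIF rotation
        kCGImageSourceThumbnailMaxPixelSize: maxSide
    ]
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil) else { return nil }
    // ImageIO decodes straight to the reduced size; the full-resolution
    // bitmap is never held in memory, so huge files don't crash the load.
    return CGImageSourceCreateThumbnailAtIndex(source, 0, options as CFDictionary)
}
```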
Kyle Howells 22h
Replying to @MarkVillacampa
Currently it only supports reading from the photo library. When I release the Mac app, one of the features I'll probably need to add is importing other files into it. I'll also then need to tackle really large files. Currently it downscales anything bigger than the GPU can handle.
Kyle Howells 22h
Just started writing a tutorial on reading/writing image file metadata with CoreGraphics and want to have a text-editable NSDictionary in a text view, so I looked up the documentation for NSPropertyListSerialization and *sigh*...
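For reference, the round trip being described, dictionary to editable plist text and back, looks something like this (the sample dictionary is illustrative):

```swift
import Foundation

let metadata: [String: Any] = ["PixelWidth": 4032, "Model": "iPhone"]

// Dictionary -> XML plist text for the text view.
if let data = try? PropertyListSerialization.data(fromPropertyList: metadata,
                                                  format: .xml, options: 0),
   let editableText = String(data: data, encoding: .utf8) {

    // Edited text -> dictionary again once the user finishes typing.
    if let edited = editableText.data(using: .utf8),
       let any = try? PropertyListSerialization.propertyList(from: edited,
                                                             options: [],
                                                             format: nil),
       let parsed = any as? [String: Any] {
        print(parsed)
    }
}
```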
Kyle Howells 22h
Replying to @MarkVillacampa
😂 The number one reason for this is simple: I don't use package managers or external dependencies. I just write all the code myself. It's currently about 6000 LOC. It also supports macOS (via Catalyst), but there's no TestFlight for macOS, so 🤷‍♂️
Kyle Howells retweeted
Sean Heber Jul 8
I strongly suspect that organizing code into hierarchies (modules, subclasses, folders) is just a sneaky form of premature optimization.
Kyle Howells retweeted
Steve Troughton-Smith Jul 9
Replying to @stroughtonsmith
Here's a multi-column NSTableView embedded in a UIKit app using _UINSView. If this API becomes public next year or so, there really will be a path from AppKit to UIKit for developers who choose to take it
Kyle Howells retweeted
Steve Troughton-Smith Jul 9
Using _UINSView to host AppKit views in Catalyst is trivial, and it routes input events properly!
Kyle Howells retweeted
Scott Manley Jul 9
Now you can use the same hardware used by Face ID to do local facial capture for your games; the Live Link Face iOS app is now available for real-time facial capture
Kyle Howells retweeted
Rico Becker Jun 29
Apple has restricted access to `~/Library/Containers/` in Finder on macOS Big Sur. It's only showing one folder in my case. In Terminal I can see that everything is still there. Any way to reactivate the normal behavior?
Kyle Howells retweeted
Steve Troughton-Smith Jul 8
Prototyping onboarding screens with SwiftUI; too much implicit behavior and too many things I don't have control of for me to want to rely on this in shipping apps. It also took me three hours, even if it is only a hundred lines of code 😂
Kyle Howells retweeted
Patrick Wolowicz Jul 7
Just discovered for creating styled App Store screenshots and it's awesome!
Kyle Howells Jul 8
Replying to @Freerunnering
I'll be writing a short blog post about how to read and write image file metadata tomorrow. It was surprisingly hard to Google for the answer, given that it turns out to be really easy. UIImage has no metadata support, but CoreGraphics can read and write it to and from an NSDictionary.
Kyle Howells Jul 8
This feature is actually the reason I made this app. I wanted to view my 360 photos and iOS doesn't support 360s, but I also want to extract normal photo-style snapshots from them with the correct original timestamp so they don't end up making a mess of my photo library.
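That timestamp fix is just more metadata writing; a sketch of stamping a snapshot with the source photo's capture date (the helper function and its parameters are illustrative):

```swift
import Foundation
import ImageIO

// Write a snapshot CGImage with EXIF DateTimeOriginal set to the 360
// photo's capture date, so Photos sorts it alongside the original.
func writeSnapshot(_ image: CGImage, to url: URL, takenAt date: Date) {
    let formatter = DateFormatter()
    formatter.dateFormat = "yyyy:MM:dd HH:mm:ss" // EXIF's date layout

    let properties: [CFString: Any] = [
        kCGImagePropertyExifDictionary: [
            kCGImagePropertyExifDateTimeOriginal: formatter.string(from: date)
        ]
    ]
    guard let dest = CGImageDestinationCreateWithURL(url as CFURL,
                                                     "public.jpeg" as CFString,
                                                     1, nil) else { return }
    CGImageDestinationAddImage(dest, image, properties as CFDictionary)
    CGImageDestinationFinalize(dest)
}
```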