I immediately took it for a spin, and it is really fantastic. I have been using the desktop version for a while to produce nice app icons for my own prototype projects. So of course I wanted to see if this works just as nicely on the iPad. My test subjects were the following three icons:
In case you were wondering, these are already the resulting SVG exports. It took me just a few hours to produce them, all on the iPad and half of it on a train ride. I intentionally included a more nostalgic, more detailed icon to get some variance.
I created this hopefully helpful artboard (which includes the examples as hidden layers) for your free use: Monkeydom iOS App Icon Template v1.afdesign
Note the v1, as I probably haven't used all the features to the max yet, and I might update it here. However, with the symbol feature, pixel preview, export persona, layer effects, easy grouping and masking, and great bezier tools, it is a powerful template for me already.
One sore spot is that the "global color" feature is currently missing from the iPad version, a feature I use a lot to adjust my color schemes after the fact.
tl;dr: I like vector graphics, I hate most vector packages, I've used an Acorn for far too long, and Affinity makes me very happy.
Vector graphics, and system support thereof, have been dear to my heart since the beginning of my personal computing days. I started out using CorelDRAW on Windows 3.1 in the early nineties, which introduced me to the subject but otherwise was not a great experience.
After my short Windows/PC exposure I took the rather unusual turn to the Acorn Archimedes and Acorn RiscPC. RiscOS, its operating system, has system-wide vector graphics support, including a data format as well as anti-aliased vector fonts. It was put to great use in the desktop publishing and word processing products on that platform, which were way ahead of the curve for a very long time.
What blew me away, though, was Computer Concepts' vector graphics package, ArtWorks: its raw speed, live anti-aliasing, and simple direct-manipulation interface (as far as I know, this is where the now-standard direct line for gradients came from). Sadly, even before the demise of Acorn, this package was discontinued (interestingly, over the lack of a great C++ compiler). It went on to be PC-only as Xara, then was bought by Corel to become CorelXara and later Xara again. The technology is still great, but it was mangled through so much marketing hotchpotch (just look at the current website) and so many Windows interface paradigms that it's quite a mess.
So I stayed with my little RiscPC for as long as I could stand it (until 2000), when I eventually moved to the Mac with the first white iBook. And while Mac OS X was slow as a dog at that point, the deep integration of fully anti-aliased fonts and Quartz 2D/PDF was so promising to me (and I was doing web development, so a Unix on your laptop didn't hurt either) that I made the jump. Although in the early 10.1 days I used the iBook more as a server for my web development, driven from my RiscPC.
And while OmniGraffle was a great charting tool, somehow nobody really rose to the challenge of providing a real alternative to the, IMHO, still horrible user experience of Adobe Illustrator. Even the more bearable tools like FreeHand got killed off over time.
For a while I used Lineform and tried to be content with it; sadly, Freeverse did not really maintain it and it died. There were other short-term contestants now and then. Most of those apps never really got anywhere, or had humongous bugs and speed problems with big documents. The last one I could somewhat stand before Affinity Designer was Sketch. However, I had my fair share of fights with its interface and bugs too.
This is why it makes me unbelievably happy that Serif plays the long game, and that it created an engine that doesn't make me afraid of working with my apps or waiting for a response. And while I'm not in print or professional asset creation, this suite of apps (and I'm looking forward to Publisher) is what drives my private creativity with computers, and it has helped me generate semi-professional to professional looking things.
With the fantastic displays and devices, and these apps, I'm having a blast doing that again. This, for me, is part of the joy of using computers: when they are simply a great tool that helps me create what I want to create, and couldn't create without them.
During my life as a software engineer there have only been a few occasions where the design and principles of a language struck me as inherently beautiful and consequently inspired me to work with it.
- Lua for its simplicity, modularity and clear design goals that are followed through.
- Objective‑C / Cocoa for its very simple object-oriented extension to C, with a great tradeoff of being consistent and clear while still having optimal interop with the existing world of C and C++. And of course because of Cocoa and its design principles, purpose, and empowerment.
- Erlang/OTP for being the most powerful implementation of the actor model, for its concept of lightweight processes, and for the ease with which it lets you build distributed systems. And, due to it being purely functional, for the eye-opening benefit of always having all the state necessary to analyze and fix any crash.
- Ruby (on Rails) for going all in: an everything-is-an-object, dynamic, object-oriented scripting language. Rails for adding fantastic model-view-controller abstractions on top of it, and for making such great use of the dynamism of the language. Among other parts, I'm still in awe of Builder for the XML view part.
None of those language/framework combos is perfect. But they all have one thing in common: a well-defined set of carefully chosen design goals, tailored towards certain usage scenarios. And those goals give guidance when using them, moving you towards consistency and greatness when writing in and for them.
Now contrast that with Swift. First of all: which question did it set out to answer? Think about it. There is no one clean answer. It just wanted to be better, more modern, the future – the one language to rule them all. A first red flag for anyone who has ever tried to do a 2.0 rewrite of anything. From the outset, it wanted to be a silver bullet:
- It should scale from App/UI language down to system language.
- It should interop with existing Foundation and Cocoa, but still bring its own wide-reaching standard library, adding a lot of duplication.
- It is functional, object-oriented and protocol oriented all at the same time.
- It wants to be strongly typed, yet at the same time so convenient in its type inference that it falls over its own feet trying to grasp simple expressions, which become "too complex" to resolve. By design.
- It is compiled and static, but emphasizes the REPL and playgrounds, a face that makes it look like a great scripting solution. Which it isn't.
- It seems to have been driven by the needs of the compiler and the gaps that needed to be filled for the static analyzer. Those seem to have been super-charged instead of catering to app developers' actual needs: efficient, hassle-free, productive (iOS) app development.
- It is meant to offer progressive disclosure and to be simple, so it can be used in playgrounds and for learning. At the same time, learning and reading through the Swift book and standard library is more akin to mastering C++. It is quite unforgiving, harsh, and complex.
On top of that, it chose to be opinionated about features of Objective‑C that many long-time developers consider virtues, not problems:
- It added compile-time static dispatch, making dynamic dispatch and message passing second-class citizens and introspection a non-feature.
- It defined the convenience and elegance of nil-message passing only as a source of problems.
- It classified the implicit optionality of objects purely as a source of bugs.
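To make the nil-messaging point concrete, here is a minimal, hypothetical sketch (the `Greeter` type is mine, not from any framework): where Objective‑C silently treats a message to nil as a no-op returning nil/zero, Swift forces the nil case to the surface at every call site.

```swift
// In Objective-C, messaging nil is a silent no-op:
//   NSString *name = nil;
//   NSUInteger len = [name length];   // simply 0 — no crash, no ceremony
//
// Swift instead makes the nil case explicit everywhere:
struct Greeter {
    let name: String?

    func greeting() -> String {
        // Optional chaining plus nil-coalescing: handling nil is mandatory.
        let upper = name?.uppercased() ?? "STRANGER"
        return "Hello, \(upper)"
    }
}

let g = Greeter(name: nil)
print(g.greeting())   // "Hello, STRANGER"
```

Whether that explicitness is a safety win or pure ceremony is exactly the disagreement at hand.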
At the same time it failed to attack, solve, and support some of the most prominent current and future computing problems, all of which, in my book, are more important than most of the areas it tries to be good in:
- overall API interaction complexity
- actual App and UI development
- developer productivity
It keeps deferring the big wins to the future, while offering only a very labour-intensive upgrade path. Without a steady revenue stream, many apps that would have compiled just fine had they been written in Objective‑C either can't easily take advantage of new device features, or had to be pulled from the App Store altogether, because upgrading would be too costly. If you are working in the indie dev scene, you probably know one of those stories as well. And while this is supposed to be over now, the damage has been done and is real.
On top of all of this, there is a great tension with the existing Apple framework ecosystem. While Apple did a great job of exposing Cocoa/Foundation to Swift as graspably as they could, there remains a deep conflict between the way Swift wants to see the world and the design paradigms that created the existing frameworks. That tension is not resolved yet, and since it is a design conflict, it essentially can't be resolved, just mitigated. It ranges from old foundational design patterns of Cocoa, like delegation, data sources, and flat class hierarchies, to the way the collection classes work, to how forgiving the API in general should be.
If you work in that world, you are constantly torn between doing things the Swift/standard-library way or the Cocoa way, and bridging in between. To make matters worse, a lot of concepts don't even have a good equivalent. This, for me at least, generates an almost unbearable mental load. It leads to writer's block and/or running around in cognitive circles, trying to answer questions about how to best express the problem, or how to make Swift happy with my code.
To add insult to injury, all this attention is taken away from the actual problem: writing a great app or framework that is a joy to use. Objective‑C/Cocoa, by contrast, always strived for maximum developer productivity and purpose.
Yes, Swift code might end up being more correct in the end. It also might alert you to edge cases early on. The flip side, however, is that it inhibits your creativity while writing. When I start programming, the way I enjoy working, I don't yet know how the API is best expressed. So I try out different ways to express it until I find a sweet spot, then go back and unify accordingly.
Swift actively distracts me in that endeavor by making me answer questions I really don't want to answer right now. Yes, the code might be less correct in the meantime, but heck, that is exactly what I want during the design phase: find my concept and its sweet spot, iterate, pivot quickly.
So my opposition to Swift is very deep – on a fundamental design level. I see it as the devil on your shoulder, always fighting to pull your attention away from your problem domain and back towards making everything completely correct, or more "swifty". And at the same time it is very unforgiving towards bigger change – ever tried to switch code from classes back to structs, or vice versa?
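As an illustration of how unforgiving that switch is, here is a hypothetical `Counter` (not from any real codebase): flipping the keyword from `class` to `struct` silently changes reference semantics into value semantics, forces `mutating` annotations, and even changes what `let` permits.

```swift
// Hypothetical example: reference vs. value semantics in Swift.
class Counter {                 // try changing `class` to `struct` ...
    var value = 0
    func bump() { value += 1 }  // ... this then has to become `mutating func`
}

let a = Counter()
let b = a        // with a class, `a` and `b` refer to the SAME object
b.bump()
print(a.value)   // prints 1 for a class; a struct would print 0,
                 // and `let a` would forbid calling a mutating method at all
```

Every call site that relied on shared identity has to be revisited by hand, which is exactly the kind of exploratory pivot the design phase is full of.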
Objective‑C had, and has, nice progressive disclosure there: you can start out prototyping and then gradually get help from the tools to refine your code once you have found your sweet spot. Turn on more warnings, run the static analyzer, optimize hot spots towards C or C++.
We are four years in now. It is not too late to pivot, take everything that has been learned, and form it into a developer experience, an evolution of Objective‑C that really caters to the goals and ideas of the platform. In my opinion, a lot of the "lofty goals" haven't been achieved and, as discussed, should even be non-goals. Just imagine a world where Objective‑C had gotten the same amount of drive and attention from Apple that Swift got. It is not a big leap to see that everyone would be better off right now. Swift just ended up being a jack of all trades, master of none. (I defer my own ideas for an Objective‑C 3.0 evolution to a potential later blog post, as this is nuanced as well.)
I would love to see a pivot like that, but to be honest, it seems quite unlikely at this point. That's not to say that an ecosystem with Swift at its heart can't produce something one can be productive and content with. However, for me at least, it definitely does not fall into the delightful category of the language/framework combos I mentioned at the beginning. This, to me, is a sad and unnecessary regression.
So given my clear opposition to Swift, where does my personal journey go now? Luckily there are a few promising avenues to explore:
- Rust – a strongly typed language with a clear direction and purpose. I don't like the strongly typed world very much in general, but for lower-level, hard problems it clearly serves a purpose. I want both to get a clear grasp of those benefits and to wander more in the lower-level world.
- Elixir/Phoenix – this looks like the perfect mind-meld of Rails and Erlang to me. I hope it is as good as it sounds.
- I won't leave the Apple ecosystem completely. Even with all their flaws, Macs and iOS devices are still, and will remain, best of breed in many ways. However, I will have a strict no-Swift policy for any of my own future software projects, and I hope that this remains feasible.
In essence, the current free 2 play model equals a slot machine in a bar. It preys on the poor people who get hooked, tries to hook them as quickly as it can, and in pure business logic, tries to get as much out of them as possible.
To add insult to injury, look at the App Store revenue numbers: no matter when you check the top-grossing lists, the first 3–10 titles are free 2 play titles. So most of our bars have already turned into casinos, trend rising. That is very shortsighted – bad for customers, bad for Apple, and bad for the industry.
And for me it just boils down to: I don't want to build slot machines.
To me, all of this seems like typical geek behaviour: something is making them uncomfortable, so they attack it on “rational” grounds.
When arguments heat up, it is always worth asking whether the current discussion warrants the intensity. If it doesn't, then looking into what exactly is causing your discomfort usually turns out to be insightful.
Drew Crawford writing about Garbage Collection:
What this chart says is “As long as you have about 6 times as much memory as you really need, you’re fine. But woe betide you if you have less than 4x the required memory.”
Really long article, greatly done. A must read for anyone in the mobile field. (via daringfireball.net)