Archive for the ‘Various Frustrations’ Category

The festering realities of Bluehost: In which I learn about “unifiedlayer.com”

Sometimes my outgoing email to a variety of recipients bounces for reasons I don’t understand. I usually try to contact the postmaster to find out why. This weekend, I actually got a response from one:

My apologies for the delay in replying.  Your email went into the gmail spam folder and so was not forwarded out to where I could respond immediately.

Going directly to “spam”? That’s not good.

The postmaster continued:

The reason that your email is blocked is because it originated at unifiedlayer.com. Unifiedlayer is one of the worst spam originators. They host spammers and they really don’t care, so I don’t have much choice but to block many of their mail servers.

Finally, some concrete information. I searched for “unifiedlayer”, finding common searches like “is unifiedlayer unsafe” and “unifiedlayer spam”. Go ahead and do those searches yourself. You’ll find that overall trust in unifiedlayer-originated mail is somewhere up there with body cavity searches, STDs, and politicians.

So I did what pretty much anyone would: I called my service provider. Bluehost told me that unifiedlayer was an in-house product, that they were well aware of the spam problem, that they worked on it really really hard (that’s a paraphrase, not a quote), and, like every other thing that Bluehost gets wrong (and gets wrong repeatedly), that if I were just willing to pay a tiny bit more per month (five bucks in this case), they’d allow my “ericasadun” domain email to go through a different originator.

I am so sick of Bluehost.

If you have any advice on how I can transfer my web site and my email away from this festering heap, please drop me an email (I’ll probably get yours even if you don’t get my reply) and help me find an alternate home. I’ve heard good things about Digital Ocean, for example, but I don’t even know where to start in terms of moving over ten years worth of email.

At least I’ve been through the process of reinstalling WordPress and have my backups.

Thanks in advance.

Auto Layout, Playgrounds, and Xcode

Today, someone asked what the easiest way was to center a view (specifically a UIImageView) inside a parent view with minimum offsets from the top and sides.

Because you’re working with image views, it’s important that you first set the content mode. For insets, it’s almost always best to go with a “fitting” aspect-scale, which preserves the entire picture even if it has to pillarbox or letterbox. (Pillarboxing adds extra spacing to each side for tall images; letterboxing adds it above and below for wide ones.)

// set content mode
imageView.contentMode = .scaleAspectFit

Make sure your view can squish its content by lowering its compression resistance:

[UILayoutConstraintAxis.horizontal, .vertical].forEach {
    imageView
        .setContentCompressionResistancePriority(
            UILayoutPriority(rawValue: 100), for: $0)
}

You must preserve your image’s aspect ratio. Calculate your target ratio by dividing its width by its height:

let size = imageView.image?.size ?? CGSize(width: 1, height: 1) // fallback avoids dividing by zero when there's no image
let aspectRatio = size.width / size.height

Add strong constraints that preserve the aspect, and make the view smaller than its parent using inset values you supply:

let insets = CGSize(width: 20.0, height: 32.0)

let constraints = [
    // Preserve image aspect
    imageView.widthAnchor
        .constraint(equalTo: imageView.heightAnchor, multiplier: aspectRatio),

    // Make view smaller than parent
    imageView.widthAnchor
        .constraint(lessThanOrEqualTo: parentView.widthAnchor,
                    constant: -insets.width * 2),
    imageView.heightAnchor
        .constraint(lessThanOrEqualTo: parentView.heightAnchor,
                    constant: -insets.height * 2),

    // Center in parent
    imageView.centerXAnchor
        .constraint(equalTo: parentView.centerXAnchor),
    imageView.centerYAnchor
        .constraint(equalTo: parentView.centerYAnchor),
]

If you want to be super cautious, keep the aspect and two center constraints at 1000 and bring the width and height ones down to 999. You can install the constraints as you create them but I prefer to break that part out so I can tweak the priorities for each constraint group:

imageView.translatesAutoresizingMaskIntoConstraints = false // required for programmatic constraints
constraints.forEach {
    $0.priority = UILayoutPriority(rawValue: 1000)
    $0.isActive = true
}

I always mess up the signs (positive or negative) of the constants. It helps to test these out in a playground rather than going by memory because the signs aren’t always intuitive. Even better, write routines that automate your Auto Layout tasks, because if you debug once (and add proper tests), you never have to think about it again.
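A quick sanity check on those signs that you can run anywhere, even without UIKit (the numbers here are mine, purely illustrative): a lessThanOrEqualTo constraint against the parent uses a negative constant to make the child smaller.

```swift
import Foundation

// Shrinking a child view by `inset` on each side means the constraint
// constant is -inset * 2. The parent width plus that (negative) constant
// is the maximum width the constraint allows.
let parentWidth: CGFloat = 320
let inset: CGFloat = 20
let constant: CGFloat = -inset * 2          // note the negative sign
let maxChildWidth = parentWidth + constant  // 320 + (-40) = 280
```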

Mac playgrounds are inherently superior to iOS ones as they don’t run a simulator and are faster and more responsive. That is to say, you don’t have to quit and relaunch Xcode quite so often. If you are debugging iOS layouts though, and your playground hangs when starting the simulator or running your code, learn to quit, kill the core simulator service, and relaunch Xcode. It will save you a lot of time.

I have a one-liner in my utilities to deal with the service:

% cat ~/bin/simfix
killall -9 -v com.apple.CoreSimulator.CoreSimulatorService

Most of the time a single quit, simfix, and launch will get things moving again with recalcitrant iOS playgrounds. Be aware that malformed format strings and other Auto Layout issues won’t produce especially helpful feedback, especially compared to the runtime output of compiled projects. If you know what you’re doing, you can set up a simple layout testbed in playgrounds with less overhead and time than, for example, a one-view project, but at a cost.

Stick to projects, and avoid playgrounds, when testing even mildly complex code-based layouts. You cannot, at this time (at least as far as I know), control which simulator will be used in the playground so it’s not great for checks on a multitude of simulator geometries. The tl;dr is that playgrounds work for general layout routines. Prefer compiled projects for specific tasks.

In which I get hacked, Part 6

Last weekend, Bluehost closed down my site. After spending significant time on the phone with support, I came to the conclusion that I needed to nuke the entire site down to the ground. The WordPress install was simply too corrupt to continue or repair.

Since my secure shell access was revoked at the time, I used their control panel to entirely remove my public_html folder. They ran a scan on my account, found no further malware, and allowed me back in.

To recover, I re-installed a fresh copy of WordPress using Bluehost’s control panel tools. I then used Cyberduck (for sftp) and secure shell to upload my WordPress database and image uploads. That’s the site you’re reading today.

I reverted my theme back a few years to a version I knew was safe. I use a customized version of the open source Frank theme. Rather than pull down a new copy, I wanted to keep my tweaks that support the ads on the right side of the screen. They don’t produce much money but they help offset the hosting costs involved in running this blog.

I also installed the following plug-ins, some old, some new:

  • ActivityLog: “Get aware of any activities that are taking place on your dashboard! Imagine it like a black-box for your WordPress site. e.g. post was deleted, plugin was activated, user logged in or logged out – it’s all there for you to see.”
  • Really Simple SSL: “Lightweight plugin without any setup to make your site SSL proof”
  • WP fail2ban: “Write all login attempts to syslog”

And on a less security-focused note:

  • oEmbed Gist: “Embed source from gist.github.”
  • WP to Twitter: “Posts a Tweet when you update your WordPress blog or post a link, using your URL shortener.”

Most importantly, I use Updraft Plus: “Backup and restore: take backups locally, or backup to Amazon S3, Dropbox, Google Drive, Rackspace, (S)FTP, WebDAV & email, on automatic schedules.” 

My daily database backups and my weekly upload backups (only for the current year, I already have backups for previous years) ensured I could get my site back up and running within hours of the most recent hack.

I still hate WordPress. I still wish I could run a static site and get comments and other great stuff in one convenient package. However, WordPress does the job I need it to do. It’s simple to write posts and interact with you.

My website is all about this connection. I don’t do any e-commerce. It’s basically a passion project rather than anything I do for business-related reasons. I like having somewhere I can get thoughts out of my head and share them with other people. Beyond that, I don’t really have any important agendas and I don’t have the time in my life to perfect my security or delivery tools.

I want to thank everyone who sent me encouragement and support during my latest hack. I appreciate the comments and the suggestions. I now have a great list of static solutions (including github.io and DNS redirect) to fall back on if I must. Yes, I’m sticking with the crappiest solution right now. I’m doing so because it’s the path of least resistance, not because I don’t prefer your suggestions.

For those with more time and more investment, the popular consensus seems to be using Jekyll/github.io with disqus comments. Other suggestions included Hugo (gohugo.io), GetGrav (getgrav.org, “No Ruby, supports comments, fun to play with”), Ghost (ghost.org), and AWS Lightsail.

I don’t know why anyone would want to hack my nothingburger of a site but I’m glad I have friends out there who helped when they did.

Tap tap, hey is this thing on?

tl;dr: Erica’s site gets hacked repeatedly. Erica’s account is closed by Bluehost. Erica wails into the void. Played with DNS, with github.io, nuked wordpress install, re-installed wordpress, re-installed data, reinstalled plugins, scanned for malware, attempted to restore DNS, wailed into void, some semblance of site possibly restored. Maybe.

postscript: I’m posting this as a test to see if my site is back and alive. If so, please make sure to use https and not http to connect. Fingers crossed.

 

Bitpocalypse Now

The bitpocalypse is nigh.

If you can afford to say goodbye, now’s the time to use those 32-bit un-installers while they still run.

Apple writes:

With the recent release of macOS High Sierra 10.13.4, the first time users launch an app that does not support 64-bit they will see an alert that the app is not optimized for their Mac.

As a reminder, new apps submitted to the Mac App Store must support 64-bit, and starting June 2018, app updates and existing apps must support 64-bit. If you distribute your apps outside the Mac App Store, we highly recommend distributing 64-bit binaries to make sure your users can continue to run your apps on future versions of macOS.

You can read more about the macOS 64-bit transition on Apple’s dedicated support page.

To find out which applications will be affected, run file from the command line on “/Applications/*.app/Contents/MacOS/*”. This should give you a good idea of which apps are about to die on you.

If you have apps in folders (like Adobe and Microsoft do), you may need to perform several sweeps, adjusting the path to include subfolders and bundled utilities.
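A minimal sweep along those lines (the grep filter is my addition; adjust the path for nested vendor folders):

```shell
# List the architecture of each app's main executable. Anything reported
# without an x86_64 slice is about to die. The `|| true` keeps the sweep
# from reporting failure on an all-64-bit system where nothing matches.
file /Applications/*.app/Contents/MacOS/* 2>/dev/null | grep -v x86_64 || true
```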

My copy of QuickTime Player 7, with its bundled QuickTime Pro features, is on the death list:

QuickTime Player 7.app/Contents/MacOS/QuickTime Player 7: Mach-O executable i386

This makes me immensely sad as I regularly use QTPro to perform video edits from trimming and masking to watermarking and generating image sequences. I do not know of any replacement and would love recommendations.

Microsoft Office 2008 will soon be dead as well. It’s a powerful suite of tools, especially Excel, and I will bitterly miss its functionality. Numbers and Pages just do not compare to the publishing-standard features offered in the hideously ugly but tremendously functional suite. I do not intend to sign up for a yearly subscription. I may do something with OpenOffice but I’ll probably stick to Apple and hate doing so.

Microsoft Office 2008/Microsoft Word.app/Contents/MacOS/Microsoft Word: Mach-O universal binary with 2 architectures: [ppc:Mach-O executable ppc] [i386]

Adobe Creative Suite 4 will also die, along with it my most comfortable set of photo editing and vector tools:

Adobe Photoshop CS4/Adobe Photoshop CS4.app/Contents/MacOS/Adobe Photoshop CS4: Mach-O universal binary with 2 architectures: [ppc:Mach-O executable ppc] [i386]

Here too, I have no intention of signing up for a yearly subscription. There are many editing alternatives. They all lack the comfort and familiarity of the tools I know. I don’t want to invest either the money or the time to get up to speed on something else. I may buy a copy of Elements and live with how limited it is. I will probably have to give up Illustrator and Acrobat (the good one, not the reader) entirely.

These three sets of tools (QuickTime Pro, Microsoft Office, and Adobe Creative Suite) all represent a significant investment of know-how and power. None of them is beautiful, but they are all part of my current toolbox. I’m floundering around trying to figure out which of my systems I can keep installations on, so I can boot in and use them as needed once the transition happens.

For now, I’m going to get High Sierra running on an external drive and port apps I need but will otherwise lose. It’s not a great solution but it may buy me some time.

Which of the apps you’ll be losing affect you the most, and how do you plan to deal with them?

Writing Swift: Adventures in Compiler Mods

Ever since Swift adopted the “implement first, propose second” rule, some contributors to the Swift Evolution process have felt sidelined and dropped out of the community. It’s frustrating having ideas for positive language changes and not being able to implement those changes.

Despite expertise in Swift and Objective-C (not to mention any number of other languages), they, like me, may not be proficient in C++ and Python, the core tools of Swift language implementation. To date, my code contributions to Swift have been extremely low level.

I think I fixed a comment, added a string, and maybe one or two other tiny tweaks. (I did work on the Changelog file a while back but that is written in Common Mark, and does not involve programming in the slightest.)

I’ve wanted to be able to build what I can dream. And I’ve slowly been diving into the compiler in recent months to see what it takes to build something new. With quite a lot of hand holding from John Holdsworth, I implemented a couple of build directives to test whether (1) asserts can fire and (2) a build is optimized.

What is Debug?

Answering “what does ‘debug’ mean?” was a harder question than I initially thought. Coming as I do from Xcode, where ‘debug’ means a scheme I select from within the IDE, it took a bit of advice and reflection to consider ‘debug’ from a platform-independent viewpoint. After going back and forth on the Swift Evolution email list and later the forums, the consensus centered on the two tests I mentioned above: assertions and optimization.

For many projects, a typical debug build is unoptimized and allows asserts to fire. As projects move into the beta process, that mindset changes: many in-house and beta builds meant for wider use need optimization.

The state of the art uses custom conditional compilation flags set with -D. This approach decouples the meaning of ‘debug’ (or, for most developers, ‘DEBUG’) from the language itself: whatever can live in build settings does, rather than persisting through source code. Assert configurations have their own flag, -assert-config <#value#>.
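For reference, the flag-based status quo looks like this (assuming a DEBUG flag you define yourself; nothing outside your own build settings gives it meaning):

```swift
// The status quo: a developer-defined flag, e.g. built with
//   swiftc -D DEBUG main.swift
// Xcode's default debug configuration passes such a flag for you, but on
// the command line it is entirely your own convention.
#if DEBUG
let configuration = "debug"
#else
let configuration = "release"
#endif
print("Built in \(configuration) mode")
```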

Introducing these two tests lets you align your not-for-release code to assert states and/or optimization states:

#if !configuration(optimized)
    // code for un-optimized builds only
#endif

#if configuration(assertsWillFire)
    // code where assertions can fire
#endif

The proof-of-concept implementation coupled with a proposal means I may be able to submit a more substantial and meaningful contribution to the language.

Going Solo

Pushing forward, I wanted to test myself and check whether I could make changes on my own, even if that solo journey was quite small. I started by attempting to build the #exit directive that was being discussed on the forums. This turned out to be a little more complicated than I was ready for.

Among other challenges, #exit may or may not be embedded in an #if build configuration test. Using #exit should allow all material up until that point to be compiled into the build while gracefully excluding all material after. I didn’t know how to check whether the directive was embedded in a condition and how to properly complete the condition (#endif) while discarding remaining text. It was, at my stage of the journey, a step too far.

I put my first attempt to the side and tried something else. I tried to push a scope using with(value) { }, so the material within the scope was native to value. That too proved too difficult without assistance although I am beginning to understand how Swift creates and manages its scope. It was a programming failure but a learning success.

Two projects abandoned, I knew I had to pick something very easy to work with. Although I would have loved to have picked up and run with Dave DeLong’s context pitch (which is discussed here), I recognized that I needed to bite off something smaller first. So I decided to add a #dogcow token that produces the string value `"🐶🐮"` in source. How difficult could that be, right?

About five hours and edits to twenty-one files later, I had it working. Kind of. Because I ran into one of the many frequent headdesk situations that plague Swift compiler development: I had focused on my edits to the most recent repo without rebuilding the supporting tool suite.

Ninja Builds

A ninja build is a quick way to build just the compiler component. But at some point you can’t ninja your way into an entire working toolchain. I couldn’t test my changes until I rebuilt everything, a process that can take many many hours on my Mac:

% ./swift ~/Desktop/test.swift
:0: error: module file was created by an older version of the compiler; rebuild 'Swift' and try again: /Volumes/MusicAndData/BookWriting/LiveGithub/Apple/appleswift/build/Ninja-ReleaseAssert/swift-macosx-x86_64/lib/swift/macosx/x86_64/Swift.swiftmodule

Argh.

Building the compiler is not a quick thing. Even a ninja build is a non-trivial investment of time. So if you want to be completely honest, the total coding and build time was a lot longer than just five hours. A lot longer.

Make sure you’ve updated your repo, incorporated the latest changes for swift and all of its support tools, and built them all before working on your branches. It will save you a lot of frustration.

Be aware that in the time it takes to create even a small project, you’ll probably be out of date with master. Hopefully this won’t affect your changes, and the files you think you’re patching are still the right files you should be patching.

Designing My Changes

The best way to add anything to the Swift compiler is to find some construct that has already been contributed and look through pull requests to discover which parts of the language they touched.

That’s how I got started with my assertions/optimization revision. I looked at the recent canImport() pull, and targeted the seven files that it involved. In the end, I only needed to modify four files in total, excluding tests. It was a fairly “clean” and simple update.

To add #dogcow, again excluding tests, I had to change nearly two dozen files, most of them written in C++, a few using Python and Swift’s own gyb (aka “generate your boilerplate”) macros.

I’ve put up a gist that details my notes as I performed file edits. (I did have to make some further changes once I started testing.) Each group consists of a file name followed by my changes, with some context around them to make it easier to return to those parts of the file.

That’s followed by any relevant grep results that encouraged me to edit the file in question plus error messages from the compiler, of which there were a few, as I made several typos and forgot at times to add semicolons to the ends of lines. (Damn you semicolons! shakes fist) I put ### markers near the errors to make them easier to find in the notes.

As you walk through my notes, you’ll notice that I had to create a token (pound_dogcow), which is a kind of MagicIdentifierLiteralExpr expression. By inserting a simple token without arguments and returning a string, I cut down on my need to parse additional components or produce a complicated return value.

(Sorry Dave! I’ll get there I hope… After all, I know where each of the five components of Context live: file, line, column, function, and dsohandle. I just don’t know how to build and export the struct so that it gets put into place and can be consumed by the Swift user.)
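For what it’s worth, you can already approximate that kind of value in user code from the existing magic identifiers (this SourceContext is my own sketch, with dsohandle left out for brevity, not Dave’s pitched design):

```swift
// Bundle the existing magic identifier literals into one value. Default
// arguments capture the *call site*, which is the whole trick.
struct SourceContext {
    let file: String
    let function: String
    let line: Int
    let column: Int

    init(file: String = #file, function: String = #function,
         line: Int = #line, column: Int = #column) {
        self.file = file
        self.function = function
        self.line = line
        self.column = column
    }
}

// Each instance records where it was created.
let context = SourceContext()
```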

As a string, my #dogcow can be used as a literal, so I conformed it to KnownProtocolKind::ExpressibleByStringLiteral. It needed to be serialized and deserialized, emit its custom string, support code completion, and more. Scrolling down my file, you’ll see the scope of notes including searches, comments, and edits for this one change.

Debugging

One of the most interesting things that happened during this exercise was when I made an actual logic error, not a syntax error, so Swift compiled but my program failed:

Assertion failed: (isString() && "Magic identifier literal has non-string encoding"), function setStringEncoding, file /Users/ericasadun/github/Apple/appleswift/swift/include/swift/AST/Expr.h, line 1052.

For the longest time I was convinced (wrongly) that because I was using Unicode, I had somehow screwed up the string encoding. This was actually a coding mistake, a genuine bug, and had nothing to do with the way my string was built nor the fact that I used emoji. It took a while to track down because my head was in the wrong place.

Notice I have DogCow returning true here. I accidentally swapped the two lines so it was originally returning false, falling into the Line/Column/DSOHandle case.

bool isString() const {
  switch (getKind()) {
  case File:
  case Function:
  case DogCow: // it's a string!
    return true;
  // it should not have been down here
  case Line:
  case Column:
  case DSOHandle:
    return false;
  }
  llvm_unreachable("bad Kind");
}

Proof-of-Concept

Once compiled, I used a few simple calls to test my work. Here’s the source code I used. I accidentally added an extra space in the assignment test, which you can see in the screenshot as well:

// String interpolation and default argument
func hello(_ greetedParty: String = #dogcow) {
    print("Hello \(greetedParty)")
}

hello()
hello("there")

// Use in assignment
let aDogCow = #dogcow
print("The value is ", aDogCow)

// Use directly in statement
print(#dogcow)

Lessons Learned

Having built a working version of the compiler incorporating my solo changes, no matter how trivial (and yes, it was extremely trivial), has been a big confidence builder. Exploring the process from consuming tokens to emitting intermediate language representations has been enlightening.

  • I learned to update everything and build from scratch before starting my work. Because if you don’t, you’ll end up doing it later and wasting that time.
  • I learned how to track down similar modifications and use them as a template for exploring what parts of the compiler each change touched.
  • I learned that some errors would not be in the compilation but in the testing, as one tends to forget things like “just because it built doesn’t mean it will compile correctly” when one is very very focused on getting things to run and extremely new to the process.

I have now worked on two (technically three) compiler modification projects. Each has taught me something new. If you’d like, take a peek at some explorations I’ve pushed to my forked repo:

The DogCow changes are clean, in the style of something that I might actually do a pull request for. The optimization checks are not. They retain all my little in-line notes I use for searching through text files to find what I’ve changed.

The early debug checks represent the time before I could get all the compiler tools built on my system. I was basically programming in my head at that point, guessing what would work or not, before the conversation on Swift forums moved me to my current design.

My guesswork was wrong. I focused on using a trio of built-in functions (like _isDebugAssertConfiguration) mentioned on-list. This turned out to be a non-viable solution. I needed to follow the example set by canImport to set my flags.

Finally, a word to the wise: Don’t ./utils/build-script -R in one terminal window and ninja swift in another at the same time. Oops.

In which I get hacked: Part 5

Last week, it started again. Numerous people on Twitter from various sites around the world reported my site looked like this:

Nothing on my site itself indicated any changes but something was hacking my Bluehost-sourced content and replacing it with its own. I was unfamiliar (as with all things to do with web hosting) with how this was happening, and (spoilers) I never figured that out.

What I did discover, with the help of Jared Lander from Bluehost, was how to mitigate the problem. (Thanks Jared!) Forcing the site to exclusively use https links through a WordPress plug-in bypassed the whole “evil hackers will redirect my http content” thing. “All” it took was asking everyone to clear caches, restart web browsers, or wait for the changes to propagate over time. It’s been seven days and I have not had any reports of further issues.

The plug-in in question is Really Simple SSL, which automatically configures websites to use secure links: just get an SSL certificate, install the plugin, and activate it. It’s step 1 that’s a burden in many cases.
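For the curious, the kind of rewrite such a plugin ends up enforcing looks roughly like this in an Apache .htaccess file (a generic sketch, not Really Simple SSL’s exact rules):

```apache
# Redirect every plain-http request to its https equivalent.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```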

This cleared up the non-English main page advertising (for those who saw it) and apparently something that affected my RSS feed. I’m not entirely sure what that was, but there had been some stray BOM characters leaking into that which resolved once the SSL/https problem was addressed. Dave Jones wrote me that my feed, which hadn’t been validating, began working properly after the plug-in.

Much as I hope that my issues are over, I’m assured by a number of people contacting me that it’s not possible to run a secure website anymore without the help of professionals.

I’ll do the best I can. I’ll keep backing up my content. But that’s about all I can do.

Thank you again to everyone who reached out to me with support, feedback, and information.

Cleaning up doc comments for formatted commits

I’m working on a proposal to introduce CountedSet, cousin to NSCountedSet, to Swift. This kind of type involves a massive amount of doc comment content. I decided to adapt the comments from Cocoa Foundation (for NSCountedSet) and Swift Foundation (for Set) as part of my coding and quickly found how ridiculous it was to do this by hand.

At first I tried to write an Xcode “reflow doc comments” extension but as I found in the past, Xcode extensions are a dreadful pain to program and debug. It really wasn’t worth doing this (although it would be my preferred workflow for use) in terms of spending my time well.

Instead, I decided to create a simple playground. I’d paste my Swift file into a known Resources file (in this case, test.swift, although I’m sure you can come up with a better name than that if you use this). I’d process the text with a simple playground implementation and print to stdout.

It was an interesting problem to solve and one that took slightly longer than I anticipated. It’s also one that’s only partially complete. The logjams involved looking ahead at the next line to decide when each blob of text was complete so it could be reflowed, preserving paragraph breaks in the comments, respecting code blocks, and leaving any in-file code intact. Reflowing the words was much easier. I’m sure you’ve written that part of it in any number of algorithms and intro-language classes.
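That easy part is just a greedy fill, something like this (a minimal sketch; the real version also has to respect comment markers and indentation):

```swift
// Greedy word wrap: pack words onto each line until adding one more
// would exceed the target width, then start a new line.
func reflow(_ text: String, width: Int) -> [String] {
    var lines: [String] = []
    var current = ""
    for word in text.split(separator: " ") {
        if current.isEmpty {
            current = String(word)
        } else if current.count + 1 + word.count <= width {
            current += " " + word
        } else {
            lines.append(current)
            current = String(word)
        }
    }
    if !current.isEmpty { lines.append(current) }
    return lines
}
```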

The parts I didn’t tackle were the special formatting required for doc comment keywords, like - Parameter, - Returns, - SeeAlso, and so forth. The associated lines for these items must be reflowed with proper indentation so the Quick Help parser can properly parse them. I leave that for another day because they are relatively minor work compared to reflowing long and complex doc comments as a whole.

I’ve put my code up on Github if you want to offer improvements, fixes, or feedback:

 

In which I get hacked: Part 4

My site was broken into again last night and was down until this morning. What fun.

Welcome to another day of hackage fun.

This morning, the odd “consig.php” with its Base64 contents and eval was back, along with an updated “.htaccess” file. I called up Bluehost support. They restored a safe version of “.htaccess” and suggested I talk with their “security people”. I said sure.

After a longish hold, they transferred me…to an entirely different company, who, starting at just $40 or so a month (and up), would provide me with a firewall. The sales pitch was strong, as was their disdain and pricing structure. I thanked them for their time and hung up.

It was time to call back Bluehost again and yell at them a little. This time, they offered to set up an SSL certificate for 90 days at a time.

This might be even more exciting if their own certificate didn’t seem to be wonky:

I also uninstalled and re-installed my plugins, and generally followed the advice passed to me this morning by Jan Östlund in his helpful tweet.

Frankly, it’s been a pretty dreadful day. Of course, the kind words of support from everyone have been a lovely counterpart to the misery of doing sys admin work. I’m surprised so many people have had to deal with this exact situation before. At least I’m not alone.