
Top 5 Things Development Writers Need to Leave in 2018: A Rant


Let well-worn phrases be forgot.



Now that 2018 is finally (finally) drawing to a close, the Editorial team at DZone is looking ahead to the new year in anticipation of the fresh, new content that we will receive from our community of software developers. The excitement is palpable.

But, first, we need to address something with you. You might want to have a seat. No, you're not in trouble; we're your friends, and we're here to help you. 

It has come to our collective attention that there are some among you, developers, writers, and marketers across Zones, who are dealing with an illness that has had a noticeable effect on your content. Don't worry, it's not your fault; this is caused by a highly contagious virus called maximus banalitus, more frequently referred to as the common cliché. One of the most troubling symptoms of the common cliché is an overuse of certain phrases and writing practices to the extent that they become redundant or lose their luster.

Now, this isn't to frighten you, just to inform you. As common as this affliction may be, I was taught that there is also an ancient cure for this, concocted from two ingredients that we, fortunately, have in abundance:

Honesty and a pinch of humor.

1. Serverless Actually Does Have Servers?! Gasp.

This one deserves a little personal background. I am one of the five Content Coordinators who curate, edit, and post the content of the wonderful DZone community onto the website, and if I may indulge myself, I think I pick some pretty high-quality content. But it certainly didn't start that way, especially not with the Cloud Zone. Enterprise cloud, containers, AWS, and Docker vs. (or with) Kubernetes were all subjects I had no familiarity with before I began working here almost one year ago.

All I knew about cloud technology was that my iTunes library was filling up before I opted for streaming services. I'm still convinced Spotify was handed to me directly from the heavens, and I treat it with the proper reverence.

All that being said, I knew some of the basics of computing: monitor, keyboard, OS, server. These are the essentials, the non-negotiables, and although the specific mechanics have changed and developed over time, they still stand. So for even the mildly tech-savvy consumer, the revelation that "serverless" computing still runs on servers, rather than on some omnipresent ethos powering millions of computers, comes as no surprise.

For developers, the well-worn revelation is as predictable as it is grating.


There are so many more interesting things to say about serverless computing, such as its security implications, or how it fits in with DevOps practices. Now the name is, admittedly, a little misleading, and if there are really any devs out there who haven't heard of it, check out this foundational article. Then take a long, hard look at your career.

But for the rest of us who are initiated, please, let's move forward with the assumption that this is no longer news.
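For anyone who wants the point made in code, here's a minimal sketch of a function written for AWS Lambda's Node.js runtime (the file name and event shape are illustrative, not taken from any particular project). A very real, provider-managed server executes it; you just never provision or patch that server yourself.

```typescript
// handler.ts: a minimal AWS Lambda handler (illustrative payload shape).
// "Serverless" only means the provisioning, scaling, and patching of the
// underlying servers are the cloud provider's problem, not yours.
export const handler = async (event: { name?: string }) => {
  return {
    statusCode: 200,
    body: JSON.stringify({ message: `Hello, ${event.name ?? 'world'}!` }),
  };
};
```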

2. Saying "Software (or Agile, or IoT, or...) is Eating the World"

If you think about it, time is the judge of a good metaphor. Some have been so impactful that they become historical taglines for their creators, famed quotes passed down through time. "All the world's a stage, and all the men and women merely players." "Chaos is a friend of mine." "Conscience is a man's compass." Others, proving themselves useful (and profound in their initial use), have been passed down and around for so long that they become time-honored parts of common vernacular, like "time is money," or "fit as a fiddle."

History will never give "software is eating the world" either of these designations, for several reasons.

First, this particular saying is almost comically hyperbolic. "Eating" the world is a little dramatic for a phenomenon that has been decades in the making, and not in a way that makes it useful or impactful. This particular phrase is intended to express the idea that software is becoming deeply integrated into the global citizen's life. Instead, all it really creates is a mental image of some removed and malicious entity called "software" that is threatening to consume...well, the world. Which works if you're making a Terminator-esque, dystopian point, but again, not the intended purpose.

"But imagery is so subjective." Mmm, it's kinda the point of a metaphor, but fine. But that brings me to my second point: it's late as hell. There's nothing remotely new or original in talking about the proliferation of software and technology from either a consumer or a developer standpoint — we've been in collective awe of computers and software since they became household goods. Now that we're at a point (and have been for some time) that software is ubiquitous in almost every second of our lives in some way, software developers are in increasingly high demand, and over half the world is online, stating the obvious as a preamble to the latest technology is simply redundant.

Third, it's just lazy. As metaphors go, "software is eating the world" sounds pretty juvenile, like a young child trying to articulate a vision in their head. It lacks the depth or profundity to bring the words to life.

My point is, let's try to revise this one for the new year into something less...frightening, shall we? You're not making an impression.

You're scaring the kids.

3. Re-Redefining Artificial Intelligence and Machine Learning


Chances are, if you either grew up in North America or are familiar with Western film and television, you've seen some depiction of the archetypal robot or software-based entity, either fighting the odds and trying to save the world or hell-bent on its destruction. From the Avengers' Ultron and I, Robot's Sonny, to the Jetsons' maid Rosie the Robot and the iconic R2-D2, artificial intelligence has been depicted as good, evil, intelligent, goofy, half-human, and disembodied.

One thing it hasn't been for a long time is "new." Even in fictional depictions, artificial intelligences are either commonplace or original in their purpose, not in their mere existence. Admittedly, the artificial intelligence in your favorite book or movie has been exaggerated, but many of its basic attributes and social implications remain intact and are reflected in real life. For example, the Iron Giant developed rudimentary language skills much like (and from) a child, as this MIT-developed parser is being trained to do. Technologists and developers are creating guidelines to govern humane AI development, similar to the Three Laws of Robotics from Isaac Asimov's I, Robot.
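And since we're being honest: most production "AI" is less Ultron and more curve-fitting. As a purely illustrative sketch (using TensorFlow.js, not the MIT parser mentioned above), here is a one-neuron model learning y ≈ 2x from four examples:

```typescript
import * as tf from '@tensorflow/tfjs';

async function main(): Promise<void> {
  // One dense neuron: about as far from a movie robot as it gets.
  const model = tf.sequential();
  model.add(tf.layers.dense({ units: 1, inputShape: [1] }));
  model.compile({ optimizer: 'sgd', loss: 'meanSquaredError' });

  // Training data for y = 2x; the model "learns" the slope from examples.
  const xs = tf.tensor2d([1, 2, 3, 4], [4, 1]);
  const ys = tf.tensor2d([2, 4, 6, 8], [4, 1]);
  await model.fit(xs, ys, { epochs: 200 });

  // Should print a value close to 10 for x = 5.
  (model.predict(tf.tensor2d([5], [1, 1])) as tf.Tensor).print();
}

main();
```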

It's more than fair at this point to presume that everyone has a basic working understanding of what artificial intelligence is, so let's talk more about what AI does and can do on a social level, consider the effects it will have on the world we know, and try to understand how we can effectively guide its development. 

Before it's too late.

4. "Digital Transformation." Just...Digital Transformation.

I'll let our Editorial Team Lead Mike Gates take the lead on this one:

"More used by marketers than developers, 'digital transformation' seems to have become the de facto term to mean finding more/better data sources and making use of that data for better decisions. After entire seconds of thought, it turns out that we already have words for that: technology and automation. The problem with digital transformation is that while it kinda sorta implies what it’s about, there are no defined principles behind it, making it useless in terms of actually solving problems.

Not that marketers will ever admit it…"

Well, there you have it, folks. Stop using "digital transformation" when "automation" or simply "technology" will do, and pass the memo on to your respective marketing departments.

5. Waxing Prophetic About the Wonders of IoT

I'm fond of this time of year as an editor. As the year rapidly comes to a close and the software industry collectively reflects on the successes, failures, and trends of the previous three quarters, developers on the front lines break out their crystal balls for the next wave. These predictions read like competitive nets, cast in the hope that, by the end of the next year, an accurate forecast will stand as a reflection of the writer's technological third eye, professional expertise, and personal dedication to the religion of software development.

The future of IoT (or Internet of Things, as many are fond of reminding us), however, is a topic of year-round discussion, particularly in the vein of forecasting. While predictions for other topics give way to their more specific and present applications, conversations among some IoT industrialists seem to grow ever more vague, culminating in these end-of-year prediction playoffs that look 10 years forward in an industry that changes weekly. 

Now, don't get me wrong: there is absolutely nothing wrong with testing the mettle of your combined knowledge and intuition to see what's around the corner. It's trying to see ten blocks down that usually results in citations of Gartner's prediction that 95% of devices will be connected by 2020. That's certainly one of the tamer and more interesting glimpses, and from a reputable source, but it often comes with very little context about the nature of this field as it grows, its social ramifications, or the motivators and steps that will get us there.

Bonus: AngularJS ≠ All Versions of Angular

One of our other editors, Jordan Baker, brought to my attention that some writers missed the memo that AngularJS is no longer the blanket term for Google's popular web development framework. AngularJS refers exclusively to the original 1.x line, which is based on JavaScript. Everything from Angular 2 onward (here's why they skipped 3) is a complete rewrite of the framework in TypeScript and is referred to simply as "Angular." And although there are still some developers who may prefer to use AngularJS, it should be understood that the name covers Angular 1.x alone.
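For the visual learners, here's a deliberately contrived, single-file sketch of the contrast (module, controller, and component names are made up). If your snippet looks like the top half, "AngularJS" is the right word; if it looks like the bottom half, it isn't:

```typescript
import { Component } from '@angular/core';

// AngularJS (1.x): plain JavaScript style, a controller registered on a module.
declare const angular: any; // supplied by the legacy angular.js script tag
angular.module('greeter', []).controller('GreetCtrl', ($scope: any) => {
  $scope.greeting = 'Hello from AngularJS 1.x';
});

// Angular (2+): a TypeScript component declared with a decorator.
@Component({
  selector: 'app-greet',
  template: '<p>{{ greeting }}</p>',
})
export class GreetComponent {
  greeting = 'Hello from Angular 2+';
}
```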

Anything else will be treated as a crime punishable by a sentence of code review for a minimum of five codebases written in COBOL.


The general conclusion here is: if some of your favorite phrases or writing practices are listed here in some variation, it might be a good idea to take a fresh perspective on them or, better yet, leave them in 2018 altogether. Because a better year starts with better writing.
