5 hot infrastructure and operations trends — and 4 going cold
The IT world is one of flux. New tools and strategies arise continually to challenge the status quo. Sometimes a clear winner leaves vanquished approaches to the dustbin of computing history. Other times, change is more like a pendulum that swings one way before swinging back.
Infrastructure and developer operations see their share of change, but at a more tempered pace than other tech domains. The teams responsible for curating the code and keeping systems running smoothly are naturally careful. Experimentation and change for the sake of change are for the high-strung innovators down in the skunkworks. When the company depends on everything just working, keeping infrastructure and operations stable comes first.
Yet many new strategies and tools have arrived of late to transform how back offices do the heavy lifting of keeping the servers and networks running. Some of these trends are driven by new innovations, some by pure economics and some by political realities. All reflect the way the teams are pushed to provide more security and faster speeds without sacrificing stability.
Hot: Multicloud
The advantages of moving code out of the server room and into the cloud have long been recognized. A rented pool of machines maintained by someone else is ideal for intermittent computations and workloads that rise and fall. There will always be questions about trust and security, but the cloud vendors have addressed them carefully with dedicated teams made possible by economies of scale.
If one cloud is a good idea, why not two or three or more? Supporting multiple clouds can take more work, but if your developers are careful in writing the code, they can remove the danger of vendor lock-in. And your accountants will appreciate the opportunity to benchmark your software in multiple clouds to figure out the cheapest providers for each workload.
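The benchmarking idea can be sketched in a few lines. The provider names, prices, and runtimes below are hypothetical placeholders; real numbers would come from your own benchmarks and each vendor's pricing pages.

```python
# Hypothetical per-hour instance prices for three providers.
PRICE_PER_HOUR = {"cloud_a": 0.085, "cloud_b": 0.096, "cloud_c": 0.077}

def cheapest_provider(runtime_hours):
    """Pick the provider with the lowest total cost for one workload.

    runtime_hours maps provider name -> measured hours the workload took
    there (the same job can run faster or slower on different clouds).
    """
    costs = {p: PRICE_PER_HOUR[p] * h for p, h in runtime_hours.items()}
    return min(costs, key=costs.get)

# Example: one batch job, benchmarked on all three clouds.
best = cheapest_provider({"cloud_a": 2.0, "cloud_b": 1.5, "cloud_c": 2.4})
```

Note that the cheapest hourly rate is not necessarily the cheapest workload: a pricier instance that finishes sooner can win, which is exactly why per-workload benchmarking pays off.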
Cold: Dynamic websites
At its outset, the World Wide Web was made up of static files. Web servers received a URL and responded with a file that was the same for everyone. This simple mechanism quickly fell out of favor when developers realized they could customize what users saw when visiting a particular URL. Web pages no longer needed to be the same for everyone. Users liked the personalization. Advertisers liked the flexibility in targeting. Businesses liked the opportunities a dynamic web presented. So elaborate frameworks arrived to help create custom pages for anyone who wanted one.
This attitude has changed of late, as developers and businesses have recognized that, despite all the options, most web pages end up being pretty much the same for everyone. Is all the overhead of creating clever server logic worth it? Why not just send the same bits to everyone using all the speed of edge-savvy content distribution networks instead? Now, some of the newest web development tools take your site and pre-distill it down to a folder of static web pages, so you can have all the flexibility of a dynamic content management system served up with the speed of static files. The results aren't completely static, however, because a bit of JavaScript can fill in holes or collect some customized data using AJAX calls.
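The "pre-distill" step is the heart of any static site generator. Here is a minimal sketch of that idea; the template, page names, and fields are invented for illustration, and a real generator would write each rendered page to a file in a public folder for a CDN to serve.

```python
from string import Template

# One hypothetical page template, rendered once at build time so the
# server can later hand the same bytes to every visitor.
PAGE_TEMPLATE = Template("<html><body><h1>$title</h1><p>$body</p></body></html>")

def build_static_site(pages):
    """pages maps a URL path -> dict of template fields; returns path -> HTML."""
    return {path: PAGE_TEMPLATE.substitute(fields)
            for path, fields in pages.items()}

site = build_static_site({
    "/": {"title": "Home", "body": "Same for every visitor."},
    "/about": {"title": "About", "body": "Rendered once, served many times."},
})
```

Any remaining per-user touches (a greeting, a cart count) stay out of the build entirely and arrive later via client-side JavaScript, as the paragraph above describes.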
Hot: Cloud on premises
As part of their sales pitches, cloud vendors have always pushed the freedom of letting go of your data and code. Hand it over to them and they’ll take care of everything. While they do give you some say over the geographical location where your code is hosted, as long as everything is humming along, there isn’t much need for knowing what’s going on with the machines you rent in the cloud.
Some companies, however, do care. They like to have their data down the hall, where everyone can stop by to see the LEDs and listen to the whir of the fans. It just feels more secure, and some companies need to protect their data at a higher level than most. The solution? Run the cloud company's software and tools on your local machines. It feels like the cloud when you provision the instances, but the boxes are where you can touch them. This combines the flexibility of the cloud's virtual instances with the emotional security of taking physical control of the machines. Plus, sometimes this approach can be cheaper — if you can manage the extra costs of installing and caring for the hardware.
Cold: AI for everything, everywhere
When the world of artificial intelligence exploded several years ago, everyone rushed to point AI systems at anything and everything. Huge datasets appeared as teams gathered every bit they could find. More information meant more training opportunities for the AIs, and that was supposed to yield smarter, more accurate results.
This overreach has raised alarm bells. Some are beginning to see the threat to privacy that comes with gathering the massive amount of information necessary for capitalizing on AI. Others worry that the datasets being accumulated are uneven and biased, raising the distinct possibility that their AI would learn only to echo this bias. Still others fret about how AIs might become too powerful, controlling too many parts of the decision chain. Now AI developers are expected to do more than answer whether the job can be done. They must weigh the dangers and consider whether the job should be done. This is also driving the rising demand for "explainable AI."
Hot: Serverless
For a long time, developers have wanted complete control over their environment. That’s because, if they couldn’t specify the exact distribution and version, they wouldn’t be able to guarantee their code would work correctly. Too many learned the hard way that inconsistencies can be fatal. So they wanted root access to a machine that they controlled.
All of those copies of the same files may keep everything running smoothly, but it's inefficient and wasteful. New serverless tools squeeze all that fat out of the system. Now developers can worry only about writing to a simple interface that loads their code just when needed and bills only for that time. It's a godsend for jobs that run occasionally, whether they're background processing or a website that doesn't get much traffic. They don't need to sit on a server with a complete copy of the operating system taking up memory and doing nothing.
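That "simple interface" is usually just one function with a fixed signature. The sketch below mimics the shape of a function-as-a-service handler; the event and context parameters are hypothetical stand-ins, not any particular vendor's real API.

```python
# Minimal sketch of the serverless contract: instead of owning a server,
# you hand the platform one entry-point function. The platform loads it
# only when a request arrives and bills only for that invocation.
def handler(event, context=None):
    """Toy job: greet whoever triggered the invocation."""
    name = event.get("name", "world")
    return {"status": 200, "body": "Hello, %s!" % name}

# In production the platform makes this call, not you:
response = handler({"name": "ops team"})
```

Everything outside the function — the operating system, the runtime, the idle memory — becomes the provider's problem, which is the whole appeal for jobs that run only occasionally.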
Cold: Thin components
Developers often build their masterpieces by threading together a collection of smaller components and libraries. Each part contributes a bit of information to the entire package. Many of the parts are off-the-shelf products such as databases or popular APIs. It’s not unusual for dozens or even hundreds of parts to work together to produce a unified web presence for the user.
Lately, though, the products have been getting smarter on their own as their creators add more features. Some databases, for instance, are more tightly integrated with the network and offer to synchronize data stored on the clients, removing the need to build this functionality. Features such as translation are now folded into other tools. As applications and services grow fatter, the glue code and customization disappear. Sometimes it turns into configuration files and sometimes it vanishes altogether. The flowchart still encompasses the same functionality, but now the boxes are fatter and there are fewer pieces to pull together and keep on top of.
Hot: Green AI
For the past few years, the rule in machine learning and AI has been: the more comparisons, computations, and training data, the better. If you wanted to make the most of AI, going big was the path to better results.
Computation, however, requires electricity, and many companies are starting to wonder whether a big algorithm with a big carbon footprint is really necessary. This is spurring AI developers to test whether they can return results that are almost as good — or at least good enough — without making the electricity meter (and subsequent cloud or on-premises costs) spin like a top.
Cold: Basic repositories
In the past, a code repository didn’t have to do much to earn its keep. If it kept a copy of the software and tracked changes over time, everyone was amazed. Now developers expect repositories to push their code through a pipeline that could include anything from basic unit tests to complicated optimizations. It’s not enough for the repository to be a librarian anymore. It must also do the work of a housekeeper, a fact checker, a quality control expert and sometimes even a cop. Smart development teams are leaning more on the repository to enforce discipline.
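The librarian-to-cop progression amounts to running a gauntlet of checks on every push and rejecting the change at the first failure. This toy sketch captures the control flow; real repositories express the same thing in CI configuration or server-side hooks, and the check rules here are invented for illustration.

```python
# Toy sketch of a repository-driven pipeline: each pushed change runs
# through a list of named checks in order, failing fast.
def run_pipeline(change, steps):
    """steps: list of (name, check) pairs; each check(change) -> True/False."""
    for name, check in steps:
        if not check(change):
            return "rejected at: " + name
    return "accepted"

steps = [
    ("lint", lambda c: "\t" not in c),         # the housekeeper
    ("unit tests", lambda c: "bug" not in c),  # the quality control expert
    ("policy", lambda c: len(c) < 1000),       # the cop
]
verdict = run_pipeline("clean small change", steps)
```

Because the checks live with the repository rather than on each developer's desk, the discipline is enforced uniformly, which is exactly the leverage smart teams are after.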
Hot: Automators
In the past, you needed to write some code to get anything done. Someone needed to fuss over variables and remember all of those rules about types, scope and syntax. Then everyone needed to listen to them prance around like Michelangelo talking up their rules about code quality, which often boiled down to pronouncements about non-functional white space (see 18.3 and 19.4).
New tools with names like “robotic process automation” are changing the dynamic. There are no droids like C-3PO, though, just amped-up data manipulation routines. Now savvy non-programmers can accomplish quite a bit using tools that remove most of the rough edges and gotchas from the development process. Anyone who can handle adding up a column on a spreadsheet can produce some pretty elaborate and interactive results with just a few clicks and no blather about closures.