It’s true that inorganic users don’t yell at customer-service reps or trash-talk companies on Twitter. But connected devices can also benefit from some less-obvious upgrades that 5G should deliver—and we, their organic overlords, could profit in the long run.
You may have heard about 5G’s Internet-of-Things potential yourself in such gauzy statements as “5G will make every industry and every part of our lives better” (spoken by Meredith Attwell Baker, president of the wireless trade group CTIA, at the MWC Americas trade show in 2017) and “It’s a wholly new technology ushering in a new era of transformation” (from Ronan Dunne, executive vice president and CEO of Verizon’s consumer group, at 2019’s Web Summit conference).
But as with 5G in the smartphone and home-broadband contexts, the ripple effects alluded to in those statements are potentially huge—and they will take years to land on our shores. Yes, you’ve heard this before: the news is big, but it’s still early days.
Massively multiplayer mobile bandwidth
The long-term map for 5G IoT promises to support a density of devices far beyond what current-generation LTE can deliver—up to a million “things” per square kilometer, versus almost 61,000 under today’s 4G. That density will open up possibilities that today would require a horrendous amount of wired connectivity.
For example, precision-controlled factories could take advantage of the space in the airwaves to implement extremely granular monitoring, and 5G IoT promises to do that job for less. “You can put tons of environmental sensors everywhere,” said Recon Analytics founder Roger Entner. “You can put a tag on every piece of equipment.”
“Either I upgrade this to fiber to connect the machines, or I use millimeter-wave 5G in the factory,” echoes Rüdiger Schicht, a senior partner with the Boston Consulting Group. “Everything we hear on reliability and manageability of that infrastructure indicates that 5G is superior.”
Millimeter-wave 5G runs on bands of frequencies starting at 24GHz, far above the frequencies employed for LTE. The enormous amounts of free spectrum up there allow for gigabit speeds—at the cost of range, which would be limited to a thousand feet or so. That still exceeds Wi-Fi’s reach, though.
Low-band 5G on the same frequencies today used for 4G doesn’t allow for a massive speed boost but should at least cover far more ground, while mid-band 5G should offer a good mix of speed and coverage—at least, once carriers have more free spectrum on which to provide that coverage. (If you’d like a quick refresher on the various flavors of 5G, our story from a couple of weeks ago has you covered!)
In the United States, fixing those spectrum issues hinges on the Federal Communications Commission’s recently announced plan to auction off 280MHz of so-called C-band spectrum, between 3.7GHz and 3.98GHz, on a sped-up timetable that could see those bands in service in two to three years.
And that means there’s some time to figure things out. Companies aren’t lighting up connected devices by the millions just yet.
The current 5G standard—formally speaking, 3GPP Release 15—does not include support for the enormous device density we’re talking about. That will have to wait until Release 16, now in its final stages of approval, although Entner warns that we won’t see compatible hardware for at least another year or two.
When it comes to the possibility of home broadband competition, we want to believe. And in the case of 5G mobile broadband, wireless carriers want us to believe, too. But whether or not technological and commercial realities will reward that faith remains unclear. As with 5G smartphones, the basic challenge here sits at the intersection of the electromagnetic spectrum and telecom infrastructure economics.
When delivered over millimeter-wave frequencies and their copious amounts of free spectrum, 5G can match the speed and latency of fiber-optic broadband, with downloads of 1 gigabit per second and ping times under 10 milliseconds. But on those frequencies of 24GHz and up, signals struggle to reach more than a thousand feet outdoors. Carriers can fix that by building many more cell sites, each with its own fiber backhaul, but a fiber-to-the-block build-out may not be appreciably cheaper than fiber-to-the-home deployments. And while residences don’t move and don’t mind wireless antennas larger than a shirt pocket—unlike individual wireless subscribers—residences also have walls that often block mmWave signals. (Presumably also unlike individual wireless subscribers.)
The other frequency flavors of 5G (the low- and mid-band ones) don’t suffer mmWave’s allergies to distance or drywall. But they also can’t match its speed or its spectrum availability—which in the context of residential broadband means they may not sustain uncapped bandwidth.
So as much as residential customers might yearn for an alternative to their local telecom monopoly—or for any form of high-speed access besides laggy connectivity from satellites in geosynchronous orbit—5G doesn’t yet rank as a sure thing. There’s a promise, but many things still need to go right for that promise to be fulfilled.
Or, as New Street Research analyst Jonathan Chaplin phrased things in an email: “If your fundamental question is ‘will 5G allow you to dump Comcast’ the answer is absolutely! Depending.”
Verizon’s bet on millimeter-wave broadband
At $70 a month for unlimited data—with a $20 discount for customers on a $30-or-higher Verizon Wireless smartphone plan—and with download speeds of 300 to 940 megabits per second, Verizon’s 5G Home service would compare well with cable even if so many cable Internet plans didn’t impose data caps and modem-rental fees.
Reddit threads about the service in Houston, Sacramento and elsewhere offer a mix of praise for its performance (including reports of upload speeds in the range of 200Mbps, significantly faster than what most cable services offer) and complaints about it not being available at individual redditors’ addresses.
“Towards the beginning of service, there were a few firmware issues with the modem Verizon provided, but they patched that within a month,” said a software engineer in Sacramento who asked not to be named. “Since then, there’s not been significant downtime that I noticed.”
“Overall I’m happy with my 5G,” wrote another 5G Home user in Houston who runs a crisis-management firm. “No downtime that I can remember. I don’t have my exact speeds but it seems pretty quick. More than enough for my TV streaming and Web surfing.”
“There were only a few short (less than 30 min?) cases of 5G service downtime that I can recall, and they were all mostly toward the beginning of my service, so I imagine they were able to fix those stability issues quickly enough,” wrote Vincent Garcia, a software engineer in Sacramento. “My speeds seem to be the same as when I first got the service: 300-600 Mbps down, 120-140 Mbps up.”
Garcia noted one other benefit: “One interesting thing I’ve noticed is that other ISPs in my area seem to have stepped up their game in terms of value (at least in terms of their initial contract period).”
One early fear raised about millimeter-wave 5G, that it would suffer from “rain fade” akin to what cuts out satellite-TV reception during showers, doesn’t yet appear to have emerged as a serious problem. Those Reddit discussions about Verizon’s service don’t mention it, while a Twitter search reveals no firsthand reports of rain-faded 5G.
Ashutosh Dutta, a research scientist at the Johns Hopkins University’s Applied Physics Laboratory, pointed to a 2019 study by researchers at the Indian Institute of Information Technology Kalyani and the University of Calcutta’s Institute of Radio Physics and Electronics in West Bengal, India. They found that “proper fade mitigation techniques” can keep even heavy rain from disrupting millimeter-wave communication at frequencies up to 40 GHz. Verizon’s 5G Home, at 28 and 39 GHz, sits on the forgiving side of that line.
We’re going to try something a little different this morning. Partially in response to several requests for more maker-focused videos and partially because my executive producer is head-over-heels in love with Pocket Circuit racing in Yakuza 0, we’re bringing you the first in what we hope to make into a series called “Mini Motors,” and it’s all about tiny cars going really fast.
RC racing in all its various forms has always been a maker-y kind of hobby, and Mini 4WD serves as an excellent genre example to start with. You take a 1:32-scale battery-powered car, spend days carefully and patiently tuning the crap out of it, and then you set it loose on a curving track as fast as its little wheels can make it go—up to 40 miles per hour (about 65km/h). The Mini 4WD that wins does so by a mixture of careful planning, careful engineering, and a big helping of pure luck.
Must go faster
For this video, we spent time talking Mini 4WD with Randy Holt, owner of the HobbyTown store in Toms River, New Jersey. The biggest factor that sets Mini 4WD apart from other RC cars is that the cars are hands-off during the race—once the green flag waves, they’re on their own. They zip around the track, steered by their built-in bumpers and rollers pushing against the track walls. Though the track appears to have multiple parallel lanes, it’s actually a single lane that spirals around the circuit, connected by a jump-over. This ensures that all the Mini 4WDs on the track race the same total distance (because otherwise the inner lanes would be shorter than the outer lanes).
Holt gives us a nice overview of Mini 4WD cars, the different race classes, and a bit of a primer on tuning and engineering. The big takeaway is that the sport is friendly to newcomers and easy to get into—you can spend $15 or so on the Tamiya Yaris shown in the video, which can be assembled and ready to race in about 45 minutes. It’s also a hobby that grows with you, and at the extreme end—if your interest runs that deep—you might find yourself adding carbon fiber parts and tweaking rollers and brakes by the millimeter to eke out faster lap times. Mini 4WD has something for all levels of racer, from casual to crazy.
A whole new world
This video has also been my introduction to Mini 4WD—and it’s a vast world with a long history, stretching back to the ’90s. Video editor Aulistar Mark is a veritable fountain of Mini 4WD trivia, and he passed this tidbit to me in an email as the edit was being locked:
Mini 4WD is an interesting international phenomenon. One aspect we didn’t get into is the ’90s anime Bakusō Kyōdai Let’s & Go!!, which is bound to come up in the comments. Bakusō Kyōdai Let’s & Go!! was localized in the US as the Saturday-morning cartoon “Let’s & Go!!”. The series also had several licensed games for multiple platforms in the ’90s, with a couple of remasters released for mobile. This would be a precursor to the Yakuza series’ Mini 4WD mini-game. It’s great stuff for nostalgia, since the ’90s cartoons were very much made like Bandai/Hasbro cartoons designed to sell toys.
If you guys like this pilot and like the series concept, we’d love to hear some ideas in the comments for additional racing circuit types to check out—please let us know!
Artificial Intelligence—or, if you prefer, Machine Learning—is today’s hot buzzword. Unlike many buzzwords that have come before it, though, this stuff isn’t vaporware dreams—it’s real, it’s here already, and it’s changing your life whether you realize it or not.
A quick overview of AI/ML
Before we go too much further, let’s talk quickly about that term “Artificial Intelligence.” Yes, it’s warranted; no, it doesn’t mean KITT from Knight Rider, or Samantha, the all-too-human unseen digital assistant voiced by Scarlett Johansson in 2013’s Her. Aside from being fictional, KITT and Samantha are examples of strong artificial intelligence, also known as Artificial General Intelligence (AGI). On the other hand, artificial intelligence—without the “strong” or “general” qualifiers—is an established academic term dating back to the 1955 proposal for the Dartmouth Summer Research Project on Artificial Intelligence (DSRPAI), written by Professors John McCarthy and Marvin Minsky.
All “artificial intelligence” really means is a system that emulates problem-solving skills normally seen in humans or animals. Traditionally, there are two branches of AI—symbolic and connectionist. Symbolic means an approach involving traditional rules-based programming—a programmer tells the computer what to expect and how to deal with it, very explicitly. The “expert systems” of the 1980s and 1990s were examples of symbolic (attempts at) AI; while occasionally useful, it’s generally considered impossible to scale this approach up to anything like real-world complexity.
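To make the symbolic approach concrete, here’s a toy sketch in Python. Every rule here is hypothetical, invented purely for illustration (it’s not drawn from any real expert system), but the shape is the point: a programmer tells the computer, case by case, exactly what to expect and what to conclude.

```python
# A toy illustration of symbolic, rules-based "AI": every case the system
# can handle must be spelled out explicitly by a programmer ahead of time.
# These rules are made up for illustration, not from any real expert system.

def classify_animal(has_feathers: bool, lays_eggs: bool, can_fly: bool) -> str:
    """Hand-written rules in the spirit of a 1980s expert system."""
    if has_feathers and can_fly:
        return "bird"
    if has_feathers and not can_fly:
        return "flightless bird"
    if lays_eggs and not has_feathers:
        return "reptile (maybe)"
    return "unknown"  # anything the programmer didn't anticipate

# A penguin is handled only because we wrote a rule for it; a bat (flies,
# no feathers, no eggs) falls straight through to "unknown".
print(classify_animal(has_feathers=True, lays_eggs=True, can_fly=False))
```

The failure mode is visible even at this scale: the real world contains vastly more cases than anyone can enumerate by hand, which is why this approach never scaled.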
Artificial Intelligence in the commonly used modern sense almost always refers to connectionist AI. Connectionist AI, unlike symbolic AI, isn’t directly programmed by a human. Artificial neural networks are the most common type of connectionist AI, also sometimes referred to as machine learning. My colleague Tim Lee just got done writing about neural networks last week—you can get caught up right here.
If you wanted to build a system that could drive a car, instead of programming it directly you might attach a sufficiently advanced neural network to its sensors and controls and let it “watch” a human drive for tens of thousands of hours. The neural network begins to attach weights to events and patterns in the data flowing from its sensors, weights that let it predict acceptable actions in response to various conditions. Eventually, you might give the network conditional control of the car and allow it to accelerate, brake, and steer on its own—but still with a human available. The partially trained network keeps learning from the moments when the human assistant takes the controls away from it: “Whoops, shouldn’t have done that,” and the neural network adjusts its weighted values again.
Sounds very simple, doesn’t it? In practice, not so much—there are many different types of neural networks (simple, convolutional, generative adversarial, and more), and none of them is very bright on its own—the brightest is roughly similar in scale to a worm’s brain. Most complex, really interesting tasks will require networks of neural networks that preprocess data to find areas of interest, pass those areas of interest on to other neural networks trained to classify them more accurately, and so forth.
One last piece of the puzzle is that, when dealing with neural networks, there are two major modes of operation: inference and training. Training is just what it sounds like—you give the neural network a large batch of data that represents a problem space, and let it chew through it, identifying things of interest and possibly learning to match them to labels you’ve provided along with the data. Inference, on the other hand, is using an already-trained neural network to give you answers in a problem space that it understands.
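The two modes are easier to see in code than in prose. Here’s a minimal sketch in Python: a single artificial neuron learning the logical AND truth table. This is vastly simpler than any real network (the learning rate and epoch count are arbitrary values chosen for illustration), but the division of labor is the same in kind: `train` adjusts weights from labeled data, while `forward` is pure inference, running frozen weights with no learning at all.

```python
import math
import random

def forward(weights, bias, inputs):
    """Inference: a weighted sum through a sigmoid. No learning happens here."""
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-z))

def train(samples, epochs=5000, lr=0.5):
    """Training: repeatedly nudge weights toward the provided labels."""
    random.seed(0)
    weights = [random.uniform(-1, 1), random.uniform(-1, 1)]
    bias = 0.0
    for _ in range(epochs):
        for inputs, label in samples:
            pred = forward(weights, bias, inputs)
            error = label - pred
            # Gradient of cross-entropy loss with a sigmoid output
            # simplifies to (label - pred) * input.
            for i in range(len(weights)):
                weights[i] += lr * error * inputs[i]
            bias += lr * error
    return weights, bias

# The entire "problem space" here is the four rows of the AND truth table.
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = train(data)

# Inference on the already-trained network: weights are frozen from here on.
for inputs, label in data:
    print(inputs, round(forward(w, b, inputs)))
```

Real networks have millions or billions of weights and chew through far larger datasets, but they run these same two phases—which is exactly where the hardware question in the next paragraph comes in.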
Both inference and training workloads can operate several orders of magnitude more rapidly on GPUs than on general-purpose CPUs—but that doesn’t necessarily mean you want to do absolutely everything on a GPU. It’s generally easier and faster to run small jobs directly on CPUs rather than invoking the initial overhead of loading models and data into a GPU and its onboard VRAM, so you’ll very frequently see inference workloads run on standard CPUs.
The long-touted fifth generation of wireless communications is not magic. We’re sorry if unending hype over the world-changing possibilities of 5G has led you to expect otherwise. But the next generation in mobile broadband will still have to obey the current generation of the laws of physics that govern how far a signal can travel when sent in particular wavelengths of the radio spectrum and how much data it can carry.
For some of us, the results will yield the billions of bits per second in throughput that figure in many 5G sales pitches, going back to early specifications for this standard. For everybody else, 5G will more likely deliver a pleasant and appreciated upgrade rather than a bandwidth renaissance.
That doesn’t mean 5G won’t open up interesting possibilities in areas like home broadband and machine-to-machine connectivity. But in the form of wireless mobile device connectivity we know best, 5G marketing has been writing checks that actual 5G technology will have a lot of trouble cashing.
A feuding family of frequencies
The first thing to know about 5G is that it’s a family affair—and a sometimes-dysfunctional one.
Wireless carriers can deploy 5G over any of three different ranges of wireless frequencies, and one of them doesn’t work anything like today’s 4G frequencies. That’s also the one behind the most wild-eyed 5G forecasts.
Millimeter-wave 5G occupies bands much higher than any used for 4G LTE today—24 gigahertz and up, far above the 2.5 GHz frequency of Sprint, hitherto the highest-frequency band in use by the major US carriers.
At those frequencies, 5G can send data at fiber-optic speeds and latency—1.2 Gbps of bandwidth and latency from 9 to 12 milliseconds, to cite figures from an early test by AT&T. But it can’t send that data very far. That same 2018 demonstration involved a direct line of sight and only 900 feet of distance from the transmitter to the test site.
Those distance and line-of-sight hangups still persist, although the US carriers that have pioneered millimeter-wave 5G say they’re making progress in pushing them outward.
“Once you get enough density of cell sites, this is a very strong value proposition,” said Ashish Sharma, executive vice president for IoT and mobile solutions at the wireless-infrastructure firm Inseego. He pointed in particular to recent advances in solving longstanding issues with multipath reception, when signals bounce off buildings.
Reception inside those buildings, however, remains problematic. So does intervening foliage. That’s why fixed-wireless Internet providers using millimeter-wave technology like Starry have opted for externally placed antennas at customer sites. Verizon is also selling home broadband via 5G in a handful of cities.
Below millimeter-wave, wireless carriers can also serve up 5G on mid- and low-band frequencies that aren’t as fast or responsive but reach much farther. So far, 5G deployments outside the US have largely stuck to those slower, lower-frequency bands, although the industry expects millimeter-wave adoption overseas to accelerate in the next few years.
“5G is a little more spectrally efficient than 4G, but not dramatically so,” emailed Phil Kendall, director of the service provider group at Strategy Analytics. He added that these limits will be most profound on existing LTE spectrum turned over to 5G use: “You are not going to be able to suddenly give everyone 100Mbps by re-farming that spectrum to 5G.”
And even the American carriers preaching millimeter-wave 5G today also say they’ll rely on these lower bands to cover much of the States.
For example, T-Mobile and Verizon stated early this year that millimeter-wave won’t work outside of dense urban areas. And AT&T waited until it could launch low-band 5G in late November to start selling service to consumers at all; the low-resolution maps it posted then show that connectivity reaching into suburbs.
Sprint, meanwhile, elected to launch its 5G service on the same 2.5GHz frequencies as its LTE, with coverage far broader than that of millimeter-wave 5G. Kendall suggested that this mid-band spectrum will offer a better compromise between speed and coverage: “Not the 1Gbps millimeter-wave experience but certainly something sustainable well in excess of 100Mbps.”
The Federal Communications Commission is working to make more mid-band spectrum available, but that won’t be lighting up any US smartphones for some time.
(Disclosure: I’ve done a lot of writing for Yahoo Finance, a news site Verizon owns.)
Some games entice you into playing them with loud marketing campaigns, sexualized cover art, or the promise of ludicrous over-the-top violence. But then there are games like Lorne Lanning’s Oddworld series—games that don’t lead with muscle- or bikini-clad heroes and defy easy categorization. Games like Oddworld tempt you into playing by promising a different kind of experience. There are guns and violence, sure, but the setting is strange, the plot is painted in shades of gray, and the hero—well, Abe isn’t exactly sexy, or really even, you know, human.
But players who gave the original Oddworld a chance back in 1997 found themselves stumbling through a unique and fascinating world that was equal parts surprising and subversive, and the series has gone on to acquire legitimate cult-success status. With the approaching release of Oddworld: Soulstorm in 2020, we thought it was a good time to pay a visit to Lorne Lanning and his team at Oddworld Inhabitants and talk about our favorite meat-processing factory worker and his long journey from design notebook to screen.
“Write what you know,” they say…
We interviewed Lanning at the Emeryville, CA headquarters of Oddworld Inhabitants, the studio he co-founded with Sherry McKenna in 1994. For Oddworld fans, the office was a magical place, stuffed with the kind of memorabilia that amasses over more than two decades of game design. Lanning walked us through his journey to become a game creator, starting from his poor beginnings in what sounds like an unstable family. He got into video games because his father had a job at Coleco, and Lanning thought gaming would be a good way to meet girls.
Lanning’s ambitions weren’t aimed at the small screen—he had his eyes set on making movies. To pay the bills, he took a job at TRW Aerospace, where he worked on anti-missile defense systems (it was the 1980s, and Reagan’s Strategic Defense Initiative boondoggle was in full swing). His exposure to soul-crushing bureaucracy and supplier management formed the basis for many of the Brazil-esque ideas later presented in the Oddworld games.
But it’s the time Lanning spent at Rhythm and Hues Studios that had the biggest effect on Oddworld—at least on the series’ collective look and feel. Working on visual effects set him on the path of visualizing game design in terms of cinema—not just how things were framed on screen, but also the discipline and budgeting style of the movie industry. When Lanning and McKenna (a fellow Rhythm and Hues alum) eventually started their own studio in the ’90s, they approached their game and their character designs the way Hollywood does. This obviously is de rigueur in 2019, but in 1995, when work on Oddworld started, it was most definitely not the industry norm.
When designing the first Oddworld game, Lanning and his team had to confront an annoying reality of game design—there are only so many ways to interact with the world in a side-scrolling action game, and a lot of those ways involve shooting stuff. And one of the immovable design goals of Oddworld was that protagonist Abe would go through the entire game without being armed—not because of any kind of political stance against guns, but because having Abe unarmed increases the character’s vulnerability in a world that’s already overwhelmingly hostile. A gun would provide an easy solution to many of the game’s problems, and where’s the fun in that?
It took some time to work out a solution, but Lanning and the other designers decided that characters like Yoda don’t need guns to solve problems. They instead infused the game with a pastiche of mysticism drawn from a number of different sources, which gave Abe his secret weapon: the ability to possess other characters, including bad guys with guns. That let them design puzzles involving shooting, which the player can solve by finding a bad guy, taking over his body, and having the bad guy shoot his way through the puzzle. To prevent the player from picking up the bad guy’s gun after the puzzle is solved, NPCs violently explode after possession.
An Odd(world) legacy
This video ended up being extremely long in the rough cut because Lanning gave us so much great interview material. We had to trim out quite a bit, but we’ll be producing an extended version if there’s enough interest in this video. There are several rabid Oddworld fans here at the Ars Orbiting HQ, and this video, like several others in the War Stories series, was a passion project with a lot of emotion invested in it (not to mention some custom voiceover lines performed by Lanning just for us!). We hope you enjoy watching it as much as we enjoyed making it.