Investors warm to software just a little bit
Dec 27, 2022
Some software stocks have had returns in the one- and three-month periods that defy the gloomy state of the software market.
For most of this year, software stocks have been among the bigger disappointments. Out of about five hundred tech stocks I regularly follow, most but not all of which are U.S. firms, the average price decline for the industry designated “packaged software” is just under fifty percent this year.
Across the forty-seven industry categories represented among those five hundred or so stocks, that sharp drop puts packaged software in the bottom twenty. Semiconductors are down only thirty-seven percent in that time, by comparison, and the group of electronic components names, including Corning, has been among the strongest, down only fifteen percent.
The poor performance is not surprising given massive disappointments in the most recent earnings reports from names such as Twilio and CrowdStrike. Software sales have been under pressure for pretty much every vendor, whether they have met expectations or not.
However, in the past month to three months, software performance has brightened. The average decline for the packaged software group in the past month is just under six percent, better than the one-month average across all industries of an eight percent decline.
And while software trails on a three-month basis, down almost eight percent versus an average of all industries of down three percent, some software stocks have notched extraordinary gains in those three months, such as Nutanix, up twenty-two percent in three months.
Below is a list of the sixty software names among this group that have beaten the average for one-month and three-month returns.
Software names that have done better than average on a one-month and a three-month basis.
Why is this happening? It’s not tied to quarterly financial outperformance or underperformance. For example, GitLab, a company that has delivered results that defy the gloom in software, has had a weaker-than-average three months, down twelve percent.
But Dynatrace, which had to cut its outlook last month, nevertheless is up almost ten percent in three months, and has managed to be just a little better than flat the past month in comparison to that nearly six percent average decline.
Rather than hypothesize some overarching theory, I would suggest there are a variety of things going on. I think the “expectations reset,” as they say on the Street, where bad news is baked in, allows some buyers to creep back into a stock such as Dynatrace.
Dare I say it, some software investors may feel they’re getting bargains in some beat-down names. Confluent, for example, is a company that is not profitable, and won’t be for years yet. (Street consensus is that it loses money through 2024, and there’s no consensus yet about 2025.)
But Confluent has had better-than-average performance in three- and one-month periods. Its results have held up better than some other names, and after a seventy-two percent decline this year, Confluent trades for a multiple of enterprise value to next twelve months’ revenue of six and a half times. That is lower than the multiple of eight times just six months ago, and way below where Confluent traded at a year ago: thirty-four times revenue.
Perhaps that doesn’t sound like a screaming buy to you, but I can imagine that some folks must be eyeing the multiple contraction for stocks such as Confluent and thinking that if the estimates are correct, then the stocks are a decent buy relative to the exorbitance they once commanded.
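For readers who want the mechanics, the multiple in question is simply enterprise value divided by the next twelve months of estimated revenue. Below is a minimal sketch using the approximate Confluent multiples cited above; the forward-revenue figure and the helper function name are my own illustrative assumptions, not reported data.

```python
# Illustrative sketch of the EV-to-forward-revenue multiple discussed above.
# The multiples are the approximate ones cited for Confluent; the forward
# revenue figure is a made-up round number, purely for illustration.

def ev_to_ntm_sales(enterprise_value: float, ntm_revenue: float) -> float:
    """Enterprise value divided by next-twelve-months revenue estimate."""
    return enterprise_value / ntm_revenue

ntm_rev = 600.0  # millions, assumed for illustration
for label, multiple in [("a year ago", 34.0), ("six months ago", 8.0), ("today", 6.5)]:
    implied_ev = multiple * ntm_rev
    print(f"{label}: {multiple}x forward sales -> implied EV ${implied_ev:,.0f}M")
```

The point of the loop is just to show how sharply the implied enterprise value shrinks as the multiple contracts, even if the revenue estimate itself holds steady.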
The TL podcast for December 26th, 2022: Azenta is tops, pondering Absci, another awful Tesla week
Dec 27, 2022
Chip stocks continue to lose altitude, but investors liked Azenta (AZTA); Tesla (TSLA) was once again a mighty dog; plus life-sciences AI hopeful Absci, and predictions of AI doom from one of AI’s luminaries.
Solar bubble? Enphase retains pricey valuation
Dec 23, 2022
It’s not news that this year’s multiple energy crises, including the soaring price of gasoline in the first half of the year, and now the prospect of Europe starved for oil during the winter because of Russia’s war in Ukraine, have been a boon to the solar energy industry.
These are a gaggle of hardware vendors that include the bellwethers, First Solar and SunPower, two panel makers that have been around since the previous solar boom, in 2007 or so; and some newcomers, including Enphase and Array Technologies, the former a maker of “inverters,” the latter a maker of “trackers” for solar farms.
As you can see in the accompanying table, the stocks of the group are up an average of thirty percent this year, pretty terrific. Just as interesting, their average stock valuation, as a multiple of enterprise value divided by the next twelve months’ sales, has risen from one times a year ago to four times of late.
One stock in particular stands out as having special momentum, if you will. Enphase, which came public back in 2012, has the second-highest revenue growth rate after Array, and unlike Array, it is profitable, with expectations it will make about $606 million in free cash flow over the next four quarters, according to FactSet estimates.
And Enphase has a perfect four-year record of topping analysts’ sales expectations, versus the rest of the group, whose reporting records have been rather mixed. I first pointed this out last year, when I wrote that Enphase was managing its business very well considering that it was struggling during the supply-chain mess to get parts to build its inverters.
Enphase builds what are called “microinverters,” parts that convert energy from DC to AC power in a solar energy system, and that do it in a “distributed” fashion, which has advantages over traditional inverters that are “centralized.”
Enphase has been riding the wave of home solar installation with the inverter product, and with a contractor model that lets a third party install the devices on the roof. Sales are expected to rise sixty-seven percent this year, to $2.3 billion. To be that size and growing that fast is impressive.
What strikes me in particular is the valuation. The company trades for a whopping fourteen times the next twelve months’ projected revenue. That kind of sales valuation makes it one of the most expensive stocks among the five hundred or so tech names I regularly follow.
In fact, in my collection, Enphase has the same multiple of enterprise value to sales as Snowflake, the cloud database software vendor. On some level, that kind of multiple makes sense because they are of similar scale and growing at similar rates.
Enphase is a little cheaper than Snowflake on a profit basis, at seventy-two times projected free cash flow versus ninety times for Snowflake. And its free cash flow yield, meaning, its free cash flow divided by stock price, is better, at 1.5% versus 1.1% for Snowflake.
That looks like a good deal, given that Enphase is doing better at generating real cash profits than Snowflake. Although Enphase has a lower gross profit margin, at forty-one percent versus Snowflake’s seventy percent, Enphase has a free cash flow margin of twenty percent, twice that of Snowflake.
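One way to see why those figures line up: a price-to-free-cash-flow multiple and a free cash flow yield are reciprocals of one another. Here is a minimal sketch, with a function name of my own invention; note that the exact reciprocal of seventy-two times is about 1.4%, slightly below the 1.5% quoted above, a gap that likely reflects slightly different estimate snapshots.

```python
# A free-cash-flow multiple and a free-cash-flow yield are reciprocals:
# a stock at 72x projected free cash flow yields roughly 1/72 of its price.
# Function name is my own, for illustration only.

def fcf_yield_from_multiple(fcf_multiple: float) -> float:
    """Free cash flow yield implied by a price-to-FCF multiple, as a percent."""
    return 100.0 / fcf_multiple

print(f"Enphase at 72x FCF   -> {fcf_yield_from_multiple(72):.2f}% yield")
print(f"Snowflake at 90x FCF -> {fcf_yield_from_multiple(90):.2f}% yield")
```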
And yet, it is rather extraordinary for a hardware maker such as Enphase to trade at the same or similar multiples of sales and earnings as a software maker. Typically, there’s a discount for the lower gross profit margin, and the greater risk, of a hardware maker. That greater risk is in contrast to cloud software sales, where there is a degree of certainty baked into the way software is billed and invoiced.
I’m inclined to think that the sixty-three percent run-up in Enphase this year is an effect of traders fixated on the energy crisis story, and that the stock has gotten ahead of itself.
But maybe uniqueness plays a role. Over the transom Thursday came a long report by Daiwa Capital Markets analyst Jonathan Kees, in which Kees initiates coverage of Enphase and First Solar with “Outperform” ratings, and a Neutral rating on SolarEdge.
Kees believes the outperformance of the group can continue, Enphase and First Solar in particular.
With respect to Enphase, Kees’s three principal points are that a) there is, indeed, an energy crisis, perhaps temporary in the case of the Ukraine war, but permanent given broader risks; b) Enphase is in a duopoly with SolarEdge for inverter sales in the U.S., and duopolies tend to be a favorable structure for protecting pricing; and c) the Inflation Reduction Act passed by the U.S. Congress is a windfall for further building.
Kees notes that the stocks have all become something of a momentum investment. “In our view, the sector has gone from high-risk and somewhat distressed to high growth and momentum,” he writes. “We believe the run over the last three years can continue even with higher interest rates as utility companies to homeowners can justify the higher cost of capital by raising rates or saving more money on utility bills.”
On the matter of energy security, it’s top of mind everywhere, he writes: “Geopolitical events like the Ukraine war have highlighted the vulnerability of energy security. The lack of global investment in fossil fuel development also can keep supply/demand tight and prices high, barring a recession.”
On the matter of duopoly, Kees writes that Enphase and SolarEdge have about ninety percent of the U.S. residential market for inverters. That’s thanks in part to the fact that the world’s dominant seller of inverters, China’s Huawei, is banned from selling in the U.S.
Enphase, writes Kees, has been doing better than SolarEdge within that duopoly: "Enphase has gained market share over SEDG in the US residential market. This favorable market has high growth and high margins.”
On the matter of the Inflation Reduction Act, it’s one of several initiatives he thinks will add fuel to the fire, so to speak: “We believe the sector is now entering a catalyst-rich environment as companies announce new manufacturing facilities to benefit from the US Inflation Reduction Act (IRA), and for order activity to accelerate during 2H23 owing to that policy, providing booking visibility into 2024-25.”
As for valuation, Kees thinks the multiples on the entire group are “surprisingly reasonable” given how much the companies have to offer. His main defense of the stock multiples is that the stocks are “within historical range.” Meaning, Enphase’s average multiple of sales over the past two years is 13.3 times, not far from what it is now.
I suppose that’s true, though it ignores the fact that the stock was much cheaper in years past. I still suspect some of the valuation is that momentum trade from a market that is enamored of alternative energy as a narrative at the moment.
I also suspect that some of the hot air that came out of Snowflake and other pricey software stocks went into these renewable energy stocks, especially Enphase.
Probably, as Kees suggests, as long as Enphase continues to deliver results, the momentum will be maintained, and that keeps the stock high.
As for the title of this article, it may not really be fair to call it a bubble, for one stock does not a bubble make. However, I suspect that regardless of individual valuation multiples, there is a bit of the momentum element in all of the names, as Kees suggests.
Micron: the auto market for chips is soaring
Dec 21, 2022
Update:
On the call this evening, Micron CEO Sanjay Mehrotra told the analysts the company is navigating “the most severe imbalance between supply and demand in both DRAM and NAND in the last 13 years” and that pricing for chips, industry-wide, “deteriorated significantly” since the last earnings report.
Although the glut of chips sitting in inventory with customers is going to burn down to “healthy” levels by the middle of 2023, said Mehrotra, which will boost Micron’s revenue growth, he predicted that profitability in the DRAM and NAND business will “remain challenged” all year long for all industry participants.
On the positive side of things, Mehrotra gave analysts a taste of how some markets for chips may see improvement:
Automotive: Sounds the healthiest. Micron’s revenue for chips for cars was up thirty percent last quarter, almost as good as the record quarter prior to that. “The macro environment does create some uncertainty for the auto market, but we see robust growth in auto memory demand in fiscal 2023,” said Mehrotra. The growth in the demand for memory chips, by volume of bits, will be double that of any other end market for the next five years, predicted Mehrotra.
Flash-based solid-state drives: Had a "very strong” quarter, and Micron has leading technology for “QLC,” the highest-density form of NAND flash that allows for replacement of disk drives. This sounds like what Charlie Giancarlo, CEO of Pure Storage, has been saying about replacing disk.
Smartphones: After sharp declines in handset sales this year, things are going to rebound from China’s unlocking, says Mehrotra. “We forecast calendar 2023 smartphone unit volume to be flattish to slightly up year-on-year, driven by improvements in China following the reopening of its economy.”
Industrial: IoT and 5G base stations and other infrastructure are still good markets but are soft at the moment; they should rebound in the second half of 2023, said Mehrotra.
On the negative side, the biggest concern is not the glut of chips itself, it’s what it’s doing to prices of chips for Micron and others. Prices of chips declined by more than 20% for both DRAM and NAND last quarter.
Hans Mosesmann of Rosenblatt Securities, in a note to clients this evening, points out the pricing issue is going to last for about six months.
Mosesmann writes that “indications are that we are bottoming in terms of industry and company inventories.” But, writes Mosesmann, that just means that “we exit in the first half of calendar ‘23 the inventory “bit” problem and enter into a headwind centered around ASPs, which we see stabilizing entering second-half calendar ’23."
With tonight’s slide in the stock to $50.26, Micron shares are down eighteen percent since being picked for the inaugural TL20 list of stocks to consider.
Previously:
Shares of memory-chip powerhouse Micron Technology declined by one percent in late trading after the company this afternoon reported fiscal first quarter revenue and profit that missed analysts’ expectations, the third time in a row it has missed sales expectations and the first time in quite a while the company has missed profit expectations.
CEO Sanjay Mehrotra noted the company met its own forecast “despite challenging conditions during the quarter,” and said the company is on “solid footing to navigate the near-term environment,” adding, “we are taking decisive actions to cut our supply and expenses.”
Mehrotra continued, “We expect improving customer inventories to enable higher revenue in the fiscal second half, and to deliver strong profitability once we get past this downturn.”
The company’s forecast for revenue and profit this quarter was also disappointing, the third quarter in a row of weaker-than-expected forecasting.
In a companion deck of slides, Micron said it sees NAND flash memory chip sales coming in lighter than previously expected this year, changing its language for “industry bit demand growth” for NAND from “slightly higher than 10%” in its previous report to now “low to mid-single digit percentage range.” That’s not a big surprise for anyone watching the collapse of NAND pricing this year.
For 2023, the company thinks demand for DRAM will rise by ten percent, and for NAND, by 20% or so.
That is a moderation from the prior quarter’s statement, “For CY 2023, we expect demand growth to be closer to the long-term growth rate of both DRAM and NAND.”
Micron again cut its outlook for its capital spending, to a range of seven billion dollars to seven and a half billion this fiscal year, down from an already reduced eight billion in the prior report.
It said its spending on wafer fab equipment, or “WFE,” will decline this fiscal year by “more than 50% Y/Y,” which is sharper than the prior language, “nearly 50% Y/Y.”
The company said its own supply in DRAM is going down this fiscal year:
Expect Micron’s CY23 production bit growth to be negative in DRAM and up slightly in NAND. Full impact of the wafer start cuts will be realized beginning FQ3-23. Micron’s bit supply in 2024 will be materially reduced from the prior trajectory.
Here are both slides side-by-side, the most recent one on top, the prior quarter’s slide below it.
Micron’s latest outlook slide.
The prior outlook slide.
Micron indicated it will continue to have a hard time with some end markets for its products. For example, in cloud data centers, where server computers need powerful DRAM, the company stated, “Expect cloud demand in 2023 to grow well below historical trend, due to the significant impact of inventory reductions at key customers.”
And for personal computers, the company said in 2023, “PC unit volume to decline by low to mid single digit %.” That is not as bad as the “high teens percentage” decline in 2022.
Absci: Will ‘zero-shot’ AI change drug discovery?
Dec 21, 2022
“We're going to look back on history and realize that that was one of the pivotal moments that changed our industry,” says Absci founder and CEO Sean McClain, “when we put a de novo antibody that was designed on a computer in humans.”
The world was dazzled last month by the debut of an artificial intelligence program called ChatGPT, from the Microsoft-backed startup OpenAI. Posing questions in natural language via the keyboard, a person can prompt ChatGPT to give a full-paragraph answer to a factual question, such as, When did people first land on the moon?
But the same program can spit out endless reams of text, fulfilling much more ambitious queries, such as, Write a poem in the style of Walt Whitman about scuba-diving in Paris.
The wide-open nature of ChatGPT is at the heart of its intrigue. The program seems so broad in its abilities that it prompts one to imagine all sorts of potential applications.
One afternoon this month, at the Manhattan satellite offices of Absci, a promising, eleven-year-old biotech firm, ChatGPT was being put to good use.
“We had a board meeting this week, and I needed something to open up with,” says Sean McClain, founder and CEO of Absci.
“We used ChatGPT to write a rap,” explains McClain, standing amidst the cubicles where the company’s AI team works.
“We asked it, Write a rap on AI drug discovery for biologics.”
“Here we go,” says McClain, nodding to a staffer who starts a beat track on his phone. McClain gives himself a three-count, then launches into the rap with gusto, swaying as he recites each line displayed on the screen in appropriate hip-hop swagger.
AI in the lab, making moves with biologics
Fast and precise, our algorithms are prolific
Discovering new drugs with efficiency and speed
Leaving our competitors in the dust, with no time to bleed
The full poem goes on for several stanzas, as you can see in the accompanying sidebar. The staff gathered around on this Tuesday afternoon are delighted with their boss and give him a round of applause when he’s done.
“ChatGPT wrote that, it’s incredible,” McClain marvels.
He will later inform me, in a follow-up email, that his board of directors, at dinner the next day, upon hearing his rap, “suggested I stick with my day job rather than moonlighting as a rap superstar.”
But that’s alright. The playful diversion is merely meant as an illustration of what McClain believes is being heralded for life sciences by the increasing power of AI programs.
ChatGPT is just one example of a broad class of AI programs known as “generative AI,” applications that don’t just label pictures of cats and dogs, but that instead produce a new artifact in the world.
Today, it is rap lyrics, but down the road, says McClain, generative AI will generate formulas for novel biologics, or drug products, with highly specific powers tuned to the biology of illness.
“The exciting part is being able to be the first company to put a biologic, or an antibody, into humans that was designed on a computer,” says McClain.
“We're going to look back on history and realize that that was one of the pivotal moments that changed our industry, when we put a de novo antibody that was designed on a computer in humans.”
AI’S BROKEN PROMISES
That is a bold statement about drug discovery, and as a journalist, I’m wary. AI in life sciences has been a long time coming, and I’ve heard many claims over the years about its potential. Following McClain’s rap, we sit down in the conference room, and I have a chance to convey my skepticism.
> Write a rap on AI drug discovery for biologics.
OUTPUT:
AI in the lab, making moves with biologics
Fast and precise, our algorithms are prolific
Discovering new drugs with efficiency and speed
Leaving our competitors in the dust, with no time to bleed

(Chorus)
AI drug discovery, for biologics
Revolutionizing the way we treat our illnesses
With precision and power, we're making waves
In the world of science, we're the crafters of new days

Our machine learning models are top of the line
Finding cures for diseases, one at a time
We're changing the game, with our cutting-edge technology
Making strides in healthcare, for the betterment of humanity.

(Chorus)
AI drug discovery, for biologics
Revolutionizing the way we treat our illnesses
With precision and power, we're making waves
In the world of science, we're the crafters of new days

(Bridge)
Gone are the days of trial and error
We're using AI to find the right answer
Bringing hope to those who need it most
With our cutting-edge technology, we're taking the reins and we're on the rise

(Chorus)
AI drug discovery, for biologics
Revolutionizing the way we treat our illnesses
With precision and power, we're making waves
In the world of science, we're the crafters of new days
Journalists, including myself, have been like the boy who cried wolf, promising trusting readers amazing AI breakthroughs just around the corner.
And yet, to date, no drug has emerged from Phase III clinical trials that can uniquely trace its success to AI. None of the supposed efficiency benefits of AI have changed the economics of the industry.
Instead, the marketplace continues to see upwardly spiraling costs for drugs. Take the recently approved Hemgenix, from the U.S. pharma company CSL Behring of King of Prussia, Pennsylvania. Designed to restore precious clotting proteins to hemophiliacs, Hemgenix is the most expensive drug in the world, at three and a half million dollars per dose, according to Nature.
Nor has AI solved the extraordinarily low success rate of drugs that get approved, about four percent of those that are attempted. Even the success rate for Phase I or II trials is stuck at about eighteen percent. For successful drugs, the time frame is still stuck at a decade or more from basic chemistry to Phase III trials.
It feels as if the promise of AI in life science, if not a broken promise, is one that has been extraordinarily over-hyped.
As I recite my chastened view, McClain nods. “I think that we are in the early innings,” he says. “We have shown fundamental advancements in the space, and actually showing, yes, this technology does what they say it can do.”
“That should translate into increased clinical success — but we're not there yet.”
The technology breakthrough to which McClain alludes is encapsulated in a research paper posted by him and his team on the free bioRxiv pre-print server in August. The paper, which has not yet been peer-reviewed, describes how the company used a neural network to predict whether a protein with a certain pattern of amino acids would be more or less likely to “bind” to an antigen — in lay terms, how likely it would be for the protein antibody to attack the pathogen in the body.
Like ChatGPT, the neural network in question is a generative AI program, in this case one introduced in 2019 by scientists at Facebook called RoBERTa, which anyone can grab off the shelf and play with.
The paper showed two remarkable results. One, RoBERTa’s predictions of which antibodies would bind were “highly accurate” compared to what could be measured in the lab by actually observing binding under a microscope. In other words, you could run the experiment on the computer instead of a lot of the lab work, potentially a huge time and materials savings.
More dramatically, McClain and team asked the neural network to invent novel combinations of amino acids by altering sections of known antibodies to create variants. Again, RoBERTa predicted how these new variants would bind and, again, the predictions of the machine were highly accurate compared to what lab results showed.
The point of that second step is that the computer using AI can run many more explorations of possible amino-acid variants than can be run through a lab where the actual assays, even “high-throughput,” take tremendous time and care to prepare.
Similar to how ChatGPT spits out whole stanzas of poetry, the Absci program is able to spit out reams and reams of amino acid variants. In the paper’s results section, McClain and team declared, “Deep language models can expand the search space of an experimental dataset by orders of magnitude.”
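Some simple combinatorics illustrate what “orders of magnitude” means here. The following is a toy sketch, not Absci’s method: the sequence is made up, and the counts merely enumerate how quickly the variant space explodes as more positions of a short antibody stretch are allowed to mutate.

```python
from math import comb

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"  # the 20 standard residues

def single_point_variants(seq: str) -> set:
    """All sequences differing from `seq` at exactly one position."""
    variants = set()
    for i, original in enumerate(seq):
        for aa in AMINO_ACIDS:
            if aa != original:
                variants.add(seq[:i] + aa + seq[i + 1:])
    return variants

def variant_count(length: int, mutated: int, alphabet: int = 20) -> int:
    """Count of sequences differing from a reference at exactly `mutated` positions."""
    return comb(length, mutated) * (alphabet - 1) ** mutated

# A made-up 12-residue stretch, standing in for part of an antibody's binding region.
cdr = "ARDYYGSSYFDY"
print(len(single_point_variants(cdr)))  # 12 positions x 19 substitutions = 228

for k in range(1, 5):
    print(f"{k} mutated positions -> {variant_count(len(cdr), k):,} variants")
```

Allowing just four positions to vary already yields tens of millions of candidates, far beyond what any wet-lab assay could screen directly, which is where a model that scores variants in silico earns its keep.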
As McClain explains it to me, “For drug discovery, it’s essentially saying, Create me a drug that has these attributes — that is the future.” Pharma, he says, then goes “from drug discovery to drug creation, where you’re actually using AI to create novel drugs that don’t exist.”
FROM WET LAB TO AI
It has been a long time coming to this point. McClain founded Absci eleven years ago after graduating a year early from the University of Arizona, where he began as a mechanical engineer but then switched to molecular biology. His focus back then was not on AI, but on the art of protein expression.
McClain’s accomplishment at the start of Absci was a refinement of the mechanism by which E. coli cell lines in the lab can be made to produce enormous volumes of proteins. The cells become like little factories for producing custom proteins that a drug maker would want, such as monoclonal antibodies that can fight viruses.
That was a breakthrough in scale. A typical test tube of animal cells would produce merely thousands of antibodies. With McClain’s approach, “You could basically take a single test-tube of our engineered E. coli, take a billion different antibody sequences, and in that single test-tube you have a billion different drug candidates.”
Absci has patents and patent applications on that protein synthesis, with McClain as the lead inventor. Because it’s rather like a production line for proteins, in an article about Absci for ZDNet earlier this year I dubbed McClain the Elon Musk of protein manufacturing, a moniker he told me gave him some amusement.
With the ability to mass-produce proteins, McClain’s company was waiting for the proper vehicle to exploit that laboratory capability. The arrival of “deep learning” forms of artificial intelligence appeared on the scene as a perfect complement.
Deep learning is generally data hungry. “There wasn’t enough data” with the mammalian cells of traditional protein research, observes McClain. “We solved that problem.” A billion proteins in a test-tube, rather than thousands, meant there was suddenly enough training data to feed neural networks such as GPT-3, the generative neural network from OpenAI upon which ChatGPT was built.
“It was basically eleven years at Absci developing the wet lab technology that would allow us to leverage generative AI,” he says.
“The craziest part is, I had no idea that E. coli was going to be the key to unlocking data for biologics with generative AI.”
Shortly before going public in July of last year, Absci bought another startup, Denovium, a three-year-old firm pioneering deep learning AI in medicine. The company was using AI to tease out novel proteins from DNA sequences.
THE FEEDBACK LOOP
Hitched to McClain’s protein factory, the Denovium AI becomes a way to engineer a kind of feedback loop. One first manufactures tons of proteins, then sends the proteins into the AI software, as symbols of amino acid chains, and out come predictions from the program about binding. Those predictions are then sent to the wet lab to be validated in the test tube. The lab validation then becomes further data for the AI programs, and the process starts all over again.
Absci as a company functions a little like a feedback loop. The main headquarters where McClain built his wet lab is in Vancouver, but he flies out once a week or so to New York, where he maintains an apartment, to visit the AI hub on the 43rd floor of 152 West 57th, an imposing modern high-rise next to Carnegie Hall where we are having our chat.
What is discovered in the AI hub in New York becomes the input to the wet lab’s test tubes in Vancouver.
“The fact that we can go from wet lab data, to training the [AI] models, in a six-week time period, that’s what’s allowed us to make these huge advancements in a short amount of time,” says McClain. Other companies have wet labs, such as Recursion Pharmaceuticals, but that is for small-molecule drugs, not large, complex molecules such as antibodies, McClain points out.
The feedback loop has attracted some of the top talent in the field, including AI lead Joshua Meier, who was previously with Facebook’s AI team. Another top AI scientist recently joined from Tesla’s self-driving team.
“They joined because we spent ten years building that feedback loop, and that’s an advantage now no one else has,” McClain tells me.
That tight coupling, says McClain, puts the company ahead of other firms that don’t have a lab, that just use software and data.
“A lot of these papers that are coming out, they say, Hey, we developed this computational metric that got improved,” he says of competing AI efforts. “But they never showed that it worked in the lab.”
The lab is important for ruling out false positives, he says.
"Because we have that six-week cycle time that we have and no one else has, that's what's really allowed us to figure out what directions are important to go in.”
“Sometimes, we see the industry going in a certain direction on a metric that they think is important,” he says, “and we find out that that metric doesn't actually correlate to the wet lab and what we actually want it to do.”
Eventually, says McClain, it will be possible to do most of the work on the computer, in silico, as it’s called. He expects to get there, but it won’t magically happen overnight. It will happen as a progression.
“Ultimately, you're still going to want to validate everything you do” in the lab, he says, "but you're not going to have to validate it to the extent we do now.”
The AI software program developed by Google’s DeepMind unit, AlphaFold, is, of course, an important advance in what can be done on the computer. AlphaFold has essentially solved the problem of how proteins fold, the problem of structure, in other words.
Structure is important, but it won’t solve the problem of an antibody binding to an antigen, says McClain.
“Structure is a part of the solution, it’s not the solution,” says McClain. “At the end of the day, you don’t actually care about the structure, you care about, Does the antibody bind to where I want?
“AlphaFold won’t tell me this structure will bind to this target at the affinity I want,” he says. “AlphaFold isn’t telling you how to design the right drug.”
TURNING THE TIDE FOR DRUG DISCOVERY
To my point about the unmet promise of AI, McClain is convinced the feedback loop will dramatically change the success rate in drug development.
“There's never been a technology out there that's been able to, let's say, take a brand new target and be able to design antibodies that hit every single epitope,” meaning, the part on the antigen to which the antibody has to attach. “And, then, to be able to go instantaneously into the lab and say which of these gives me the biology I want to achieve.”
“This is the huge game changer — boom! You can instantaneously know what’s going to give you the biology that you want.”
What’s more, not only binding but also other qualities can be predicted at the same time. In the August paper, the research showed that RoBERTa could predict what Absci has christened “naturalness.” Naturalness means how close an antibody is to naturally occurring antibodies. Greater naturalness can make an antibody easier to produce, and make it more effective against a target in practice.
“You’re not going to have this iterative traditional drug discovery process that takes years and, ultimately, gets sub-optimal hits,” says McClain. “We can get everything right the first time and dramatically reduce the time it takes to get into the clinic.”
By divining both the binding ability of a protein, and its naturalness, “You're no longer having to sacrifice different attributes for each other, and, kind-of, taking suboptimal hits,” says McClain. “You're able to take the optimal hit the first time.”
In AI, the ability of something like ChatGPT to spew out rap lyrics the first time you ask it, without practice, is called “zero shot.” Effectively, McClain is saying that his company’s AI models, in conjunction with the wet lab, will get so good, they’ll be zero shot at generating a good antibody.
“Again, it’s just like how I told ChatGPT to write a rap on drug discovery; we’re going to be able to do that same thing for biologics, feed in the target sequence and have the AI then give us an antibody with all the attributes we want.”
The normal time to get from chemistry to clinic is four years. “We believe we can get that to about eighteen months,” says McClain.
Following on the success of the paper posted in August, McClain expects that “soon, very soon here, we're going to be releasing where we sit on de novo design,” he says, meaning, tailoring a drug from scratch. That may come at an investor conference, he says.
“Things are accelerating faster than we had anticipated,” says McClain.
WORKING WITH BIG PHARMA
Startups don’t generally do their own drug development, and Absci is partnering with multiple drug giants to take its AI and wet lab into the clinic, the most prominent partnership being with Merck.
“Our goal is to be in the clinic in 2024,” says McClain.
To do so, McClain has lured star talent.
A third hub for the company is in Zug, Switzerland, south of Zurich, where pharmaceutical luminary Andreas Busch runs the company’s “innovation center.”
Busch had been on the board for four months when McClain said, “We need you full time” to oversee the company’s work with the drug makers.
Busch has the important duty of bringing traditional Swiss cookie samplings with him on his trips to the New York office, which he puts out in the common area for all to enjoy. He finds New York fascinating, he tells me, but confesses the pace and complexity make him happy to return to the sleepy terrain of Zug.
He also has the important duty of being a steady hand who’s seen the full cycle of drug development. The sleepy Zug canton is, in fact, the locale of many Big Pharma companies, and Busch has helped to run R&D at many of them, including Sanofi, Bayer, and Shire.
“It’s incredible that we even landed Andreas,” says McClain. “He is one of the most prolific R&D, large pharma executives in the industry, I mean, he’s gotten over ten drugs approved, all the way from the bench, which, I think, is more than any other large pharma exec.”
McClain with chief innovation officer Andreas Busch, center, and chief AI officer Joshua Meier, in the company’s New York satellite office. The unique assets such as the wet lab have been a big factor in attracting “the best of the best” in talent, says McClain.
EXPLAINING THE ODDS TO INVESTORS
All of this research has to come to market, and what expectations to set with investors is a complex matter. Absci is growing very fast off of a very small base of revenue. The Street models sales doubling to a little over nine million dollars this year, and almost doubling again next year to eighteen million.
Absci releases quarterly press releases, but it has not held the traditional conference call with Street analysts since coming public. The stock is covered by a handful of analysts including Credit Suisse, Cowen & Co., and Stifel Nicolaus.
“We didn’t want to set a precedent of doing it [conference calls] because we’re not an earnings story,” says McClain.
All investors, including Fidelity Investments, the second-largest holder, says McClain, know that “It’s going to take time for revenue to ramp up.” In the meantime, the right metric to watch is the company’s programs with Big Pharma. Those programs pay out in multiple ways, starting with up-front payments, followed by payments for milestones achieved, followed by, someday, royalties, assuming a drug succeeds.
So far this year, Absci is ahead of its intended goal of signing eight programs with pharma companies, having achieved ten, including three with Merck. The Merck deals, which carry the option of collaborating on three different drug targets, are valued at $610 million in milestone payments and eventual royalties. Meaning, up-front payments and milestone payments to Absci would be about $200 million per drug.
“Building up that portfolio [of programs], things get more advanced, and that's when you start getting that cascade of large milestone payments being hit, and ultimately, ramp up to royalties,” explains McClain.
The company doesn’t say how much up-front payment is baked into each deal. “I will say it is a significant payment that definitely covers the cost of the work that will be done,” says McClain. Revenue of $2.4 million in the most recent quarter was mostly from milestone payments by Merck, the company has said.
Given that it can take time for the total $610 million value of something like the Merck deals to be realized — if ever — Absci’s CFO, Greg Schiffman, will point out at investor meetings that the net present value, the discounted cash flows, of future deals is in the neighborhood of $15 million to $20 million per program.
“The way to think about it is, if we were today to go to a royalty firm that buys royalty streams, they would pay you today $15 million to $20 million for that particular program,” says McClain. “And so, if you did ten programs, that’s $150 million to $200 million of lifetime value that you’ve created; you’re not recognizing it today, but if you wanted to, you could go off and sell those.”
McClain is quick to add “But that’s not the business model,” meaning, selling the royalties.
As for the revenue ramp, it’s highly dependent on what happens in the lab, the AI hub, and the clinical path, and it’s also tied to clinical success and commercial marketing if a drug happens to make it that far.
“You could think of it as, a good chunk of it is in-clinical development, and then we get royalties on top of that,” says McClain. “I would expect in the next three to five years, we will see revenues ramping up significantly, and really starting to create that hockey stick from a revenue perspective.”
Those milestones can come quicker if Absci can boost those dismal success rates of four percent overall and eighteen percent for Phase I and II trials. The bet McClain and team are making is that they improve those success rates, which would improve the payoff represented by net present value.
“All of our investors and analysts know that that's highly conservative,” he says of the $15 million to $20 million estimates. “Because with our technology, it should — and it will — in future increase success rates,” he says. “Even if you go from four percent to eight percent” success rate for drugs, he says, “that’s huge, even just on a net present value basis, that’s going from $15 million or $20 million to $30 million to $40 million — you double success, you double the NPV.”
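McClain’s back-of-the-envelope claim that doubling the success rate doubles net present value can be checked with a quick sketch. The payoff figure below is implied from the CFO’s $15 million to $20 million per-program range, not a company disclosure:

```python
# Risk-adjusted NPV scales linearly with the probability of success,
# which is why doubling the success rate doubles the NPV.
def program_npv(success_rate, payoff_if_approved):
    """Expected (risk-adjusted) value of one partnered program."""
    return success_rate * payoff_if_approved

# Back out the implied payoff from a 4% success rate and the ~$17.5M
# midpoint of the CFO's $15M-to-$20M per-program NPV range.
implied_payoff = 17.5e6 / 0.04

print(f"NPV at 4% success: ${program_npv(0.04, implied_payoff) / 1e6:.1f}M")
print(f"NPV at 8% success: ${program_npv(0.08, implied_payoff) / 1e6:.1f}M")
```

At four percent the sketch reproduces the $17.5 million midpoint; at eight percent it doubles to $35 million, which is the whole of McClain’s argument in one multiplication.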
PROOF OF CONCEPT
At the same time that it partners with Merck, says McClain, Absci will pursue some programs of its own. “We’re actually going to be developing our own pipeline” of drugs, he says. “We don’t plan on taking these to Phase III, because that’s extremely expensive, but if you can take it to a proof of concept, let’s say, get an efficacy readout in Phase I, that’s a huge value to the asset.”
He predicts “we’re going to be able to get there faster than anybody else can,” meaning, getting to a Phase I. As for what those drugs might be, it won’t be a cure for cancer, he says. Rather, the orphan drug market, where disease cohorts are small enough that they usually don’t attract much investment, “is a really interesting area,” he says. He declines to say which indications those might be, but says the company intends to disclose that next year.
Getting to a Phase I trial with its own drug, and proving the Absci feedback loop, “gives you huge credibility, huge validation that the platform works, and that’s going to be driving even more partnerships our way,” observes McClain.
While oncology is off the table for the moment, McClain does allow that the topic is an intriguing one.
“The issue with oncology is that everyone’s going after the same known targets,” he says. “What we need to do is actually find new targets.”
PATENTS PENDING
If the company finds new targets for anything, it opens up a whole other aspect of the business: patents. The more that drug discovery turns to drug design, the more that Absci may be able to establish patents on both antibodies and drug targets, says McClain.
“We have very broad IP here,” says McClain. That includes both the wet lab technology of protein expression and whatever is developed with AI.
“If you can take a novel target, use the platform to develop antibodies against all the epitopes [locations on the target], then that enables you to make patent claims that you couldn’t have enabled in any other way.”
The dynamic of designing antibodies and linking them to the antigen becomes a kind of circular dynamic that establishes exclusivity, says McClain.
“You give yourself a big runway to go after that target where no one else could come in after you because you’ve defined the target as a function of the sequence variety [of the antibody] that goes after it.”
The company’s general counsel, Sarah Korman, who had been the head of IP and licensing at Amgen, was lured to Absci in part because of the patent prospects, McClain tells me.
“She saw the diversity that our AI models could create, and how that could actually enable very broad patent claims that previously were unattainable,” says McClain.
“Broad claims means you can block other people, and ultimately, kind-of, dictate who can come into a target.”
A power, no doubt, one must wield carefully, I offer.
“No, absolutely,” replies McClain. “You have to always remain, What is best for patients? How do we do what’s best for patients, and make money, and create shareholder value?”
THE BEGINNING OF THE ROAD?
If, as he says, Absci gets into the clinic in 2024, and if its AI models are really “zero shot” drug development machines, how soon will it be clear that this whole approach is going to make good on the promise of AI? Is it ten years down the road? Is it more than that?
“No, I think you’re going to see clinical proof of concept way sooner than ten years,” says McClain. “Even being able to get a Phase I efficacy readout, so you can actually show something that gets people excited, and then you go to Phase II and show that proof of concept.”
“This goes back to having somebody like Andreas on board that really knows the clinical development side, knows where to look for targets that could give us early efficacy signals in a phase one.”
It is still early innings, McClain reminds me again, as we wrap up. “I think the public needs to know that,” he says. “But, early innings are exciting.”
I’m reminded of cancer biologist Robert A. Weinberg’s great book, Racing to the Beginning of the Road. Weinberg wrote that the 1970s and 1980s were the decades that taught scientists the mechanism of cancer, why cells go rogue, why programmed cell death fails to rein in chaos.
Looking back from the 1990s on those decades of fitful, meandering research, Weinberg declared, hopefully, “after so long, we finally know where to look” for a cure.
Perhaps after years of work in wet labs and in AI, McClain and others have the tools they need to begin to make serious breakthroughs.
As we walk to the elevators, McClain, Absci’s single largest individual shareholder, with just under ten percent of the stock, tells me, “I think this will show you my bullishness: in eleven years, I haven’t sold a single share of stock.”
Absci shares, at a recent $2.38, are down seventy-one percent this year, and down eighty percent since IPO.
Nvidia: buy it, it’s troughed, says Needham
Dec 20, 2022
After a forty-four percent decline in shares of Nvidia this year, expectations for the chip giant have been thoroughly washed out, and now is the time to buy the stock, writes Needham & Co. analyst Rajvindra Gill in a note to clients Monday.
“NVIDIA is our Top Pick for 2023, and we are adding it to the Needham Conviction List,” writes Gill, while raising his price target on the shares to $230 from $200, and reiterating a Buy rating.
Estimates for Nvidia, notes Gill, have been cut by fourteen percent this year for the company’s revenue in 2022, from $31.2 billion to $26.9 billion, and the earnings per share estimate has been cut by thirty-four percent. (Nvidia’s fiscal year actually ends in January, but Gill uses calendar years to simplify the matter.)
For 2023, estimates have been slashed by twenty percent for revenue and thirty-one percent for EPS.
Gill argues that the company’s slump in sales of video game cards is about to come to an end, writing “Gaming revenue has bottomed (inventory cleared exiting this year).”
You’ll recall that Nvidia cut its expectations in August because of rising inventory of GPU chips brought on by slowing sales in the video game market.
Following that, CEO Jensen Huang told the Street last month that Nvidia is finally finding its footing in the gaming market. “We are quickly adapting to the macro environment, correcting inventory levels and paving the way for new products,” he said at the time.
Meantime, writes Gill, the data center market, which is actually the larger category of product for Nvidia “remains on a solid footing,” excluding the weakening of sales into China’s data centers.
The data center market, moreover, is extra attractive given that Nvidia is dominating sales of “GPU-based accelerator cards” that speed up server tasks. Citing data from Gartner, Gill writes that such cards have an “attach rate” of sixteen percent, meaning, sixteen percent of servers sold ship with one of those cards. “We believe NVDA dominates the market here; corroborated by our Top500 analysis” of supercomputers, he writes. That dominance, he argues, is prompting customers to upgrade to Nvidia’s latest and greatest GPU for data centers, the “H100,” or “Hopper” chip that came out this fall.
See also:
Nvidia CEO Huang: cloud expands the company’s reach into enterprises, November 15th;
Nvidia’s forecast in-line with street, says ‘quickly adapting’ to global economic slowdown, November 15th.
Last, the valuation has come down for Nvidia, even if it’s still pricey. The multiple of enterprise value to sales for calendar 2022 estimated sales is 15.4 times, which is down thirty-five percent from what it was at the beginning of the year. And the multiple for next year’s estimated sales is 14.3 times, down twenty-nine percent.
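For reference, the multiple Gill cites is enterprise value divided by estimated sales. A minimal sketch of the arithmetic, using placeholder balance-sheet inputs rather than Nvidia’s actual figures:

```python
# Enterprise value = market cap + debt - cash; the multiple divides
# that by a year's estimated revenue.
def ev_to_sales(market_cap, total_debt, cash, revenue):
    return (market_cap + total_debt - cash) / revenue

# Hypothetical inputs, in billions of dollars; revenue is the $26.9B
# calendar-2022 estimate cited earlier in the article.
multiple = ev_to_sales(market_cap=420.0, total_debt=11.0, cash=17.0, revenue=26.9)
print(f"EV/Sales: {multiple:.1f}x")  # EV/Sales: 15.4x
```

The placeholder inputs are chosen only to land near the 15.4-times multiple in the note; the point is the mechanics, not the balance sheet.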
“Nvidia’s multiples have come down significantly since the start of the year, alongside its stock price,” writes Gill.
That means Nvidia’s stock is less risky than the broader market, he opines:
Market multiples have expanded meaningfully since the October bottom. We posed the question “is this the bottom or another head fake?” in our recent earnings review. We expect the first half of CY23 to remain choppy as estimates likely need to come down further. Yet for NVDA, we think estimates are much closer to the bottom. We believe the shares can trade closer to their pre-COVID levels as both Gaming and Data Center growth accelerates in 2H23.
Nvidia shares, despite Gill’s positive missive, closed down two percent Monday.
Nvidia is one of the TL20 stocks to consider. Its shares are up three percent since the TL20 was inaugurated in July.
The TL podcast for December 18th, 2022: Chip stocks break their winning streak, and what ails Tesla
Dec 19, 2022
News of the Federal Reserve’s half-point hike on Wednesday sent everything into sharp decline, but chip stocks, in particular, have reversed their recent winning streak. Tesla (TSLA) was one of the biggest losers of the week, and Elon Musk could fix what ails the stock if he’d make more disclosure about metrics around car sales.
Tesla: Musk needs to give some Metrics
Dec 18, 2022
A week ago or so, I received a tweet in my Twitter feed from Elon Musk, informing me that Twitter’s average load time has improved by four hundred milliseconds. I doubt I noticed the difference, and I doubt many other Twitter users did either.
Frankly, while Musk is re-arranging the deck chairs at Twitter, I suspect most investors would be a lot more interested to hear more from him about measurement at the one company of his that has the most prospect of actually being a great business, Tesla.
Tesla has been a real dog for the past four months or so. I refer to that amount of time because it is the time since inception of TL20, The Technology Letter Twenty, the list of twenty great companies to consider investing in. Tesla is a great company, and it’s one of the TL20. It could be a great stock, but it’s been terrible since the inauguration of The Twenty.
Tesla is the worst performer since the July 15th inauguration date, down thirty-seven percent. Because the TL20 is market cap-weighted, Tesla exercises a substantial effect upon the group. The entire TL20 is down 12.6% since inception, but excluding Tesla, it would be down just 2.2%.
By that alternate measure, the TL20 would be ahead of the 6.5% decline in The Nasdaq Composite Index since July 15th, and it would be a lot closer to the Standard & Poor’s 500 Index, which is roughly flat since July.
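Because the TL20 is cap-weighted, the three figures above roughly pin down how large Tesla’s weight in the basket must be. A quick sketch, treating the ex-Tesla figure as a simple reweighting of the remaining nineteen names:

```python
# Solve w * r_tesla + (1 - w) * r_rest = r_basket for Tesla's weight w,
# using the returns stated in the article.
tesla_return = -0.37     # Tesla, down 37% since July 15th
ex_tesla_return = -0.022  # TL20 excluding Tesla, down 2.2%
basket_return = -0.126    # full TL20, down 12.6%

implied_weight = (basket_return - ex_tesla_return) / (tesla_return - ex_tesla_return)
print(f"Tesla's implied weight: {implied_weight:.0%}")  # roughly 30%
```

That is a simplification (weights drift as prices move), but it shows why one stock can drag a cap-weighted basket of twenty names down by ten extra points.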
What will stem this decline? What might help is if Musk were to share more data about how Tesla’s sales prospects look. The data the company traditionally discloses is vague, and has lately become very cloudy and hard to interpret.
That is happening at a time when there is rising anxiety that Tesla’s sales may be about to fall apart. Tesla is a bubble stock, and to an extent, a bubble company, and it has never been tested in a prolonged economic downturn.
Founded in 2003, Tesla came public June 28th of 2010, after the last major economic contraction, The Great Recession. Its vehicles, which list starting at just under fifty thousand dollars for the base Model 3, represent pricing in a time of relative economic prosperity.
In an economic crunch, even some bulls think Tesla will have to cut prices. An Uber-bull on the stock, Trip Chowdhry of the boutique Global Equities Research, writes in recent missives to investors that both Tesla, and competitor Lucid Group, “will need to drop vehicle prices by at least 10% to 15%” because neither is “immune to recession.”
If it is true that Tesla is now entering its first real test of demand, it might be good to have a more solid measure of demand. That’s where metrics come in.
Tesla’s financial reporting has to date consisted of a two-step: deliveries and revenue. Tesla recognizes revenue on cars only when the customer takes possession of the vehicle, the delivery. The company reports total car deliveries shortly after a quarter is over, and two weeks later, it reports revenue. In between, the Street writes predictions about revenue using deliveries as a leading indicator.
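That two-step can be sketched in a few lines; the blended average selling price below is my assumption for illustration, not a Tesla disclosure:

```python
# The Street's simple calculus: deliveries print first, revenue two
# weeks later, so deliveries times an assumed ASP is the estimate.
def estimate_auto_revenue(deliveries, avg_selling_price):
    return deliveries * avg_selling_price

# Third-quarter deliveries of 343,830 cars (stated later in the article)
# at a hypothetical ~$54,000 blended average selling price.
revenue = estimate_auto_revenue(343_830, 54_000)
print(f"Estimated automotive revenue: ${revenue / 1e9:.1f}B")
```

The fragility of the approach is evident: the whole estimate hangs on one assumed price, which is exactly what backlog and price changes distort.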
That kind of very simple calculus is a thing that works just fine when business is going up and up in good times, much as Netflix, another bubble company, for a long time dazzled the Street with rising subscriber numbers — until it stopped growing.
For Tesla, deliveries have become problematic as a reflection of anything of late because they’re under pressure from factors unrelated to demand.
Tesla, like many firms that make real, physical stuff, has been dealing with the supply chain issue. As a result, deliveries lately are not keeping up with expectations. Musk, and CFO Zachary Kirkhorn, have set a goal to increase both production of cars and deliveries, on a unit basis, by fifty percent, annually, over a multi-year time horizon. Production rose fifty-four percent last quarter, but deliveries rose only forty-two percent, and Musk and Kirkhorn said deliveries will continue to be under pressure.
The Street expects deliveries will still be under pressure in 2023, forecasting total delivery growth of just forty-four percent next year.
There’s a second reason deliveries are cloudy, and that’s the rising backlog of cars not delivered. The company has been raising prices this year to offset rising costs, not just materials costs but the rising cost of freight.
Consequently, cars in backlog, once they are delivered and recognized as revenue, will probably skew Tesla’s average pricing upward irrespective of demand. That can cloud Tesla’s true pricing power, quarter to quarter.
If deliveries aren’t a great indicator of demand, Tesla’s actual commentary about demand is vague.
Musk and Kirkhorn typically talk about how the “order book” is doing in broad terms. On the October earnings call, Musk remarked that, “demand is a little higher than it would otherwise be,” without elaborating.
It would be nice to have another measure of demand. The balance sheet provides some extra data, but not much. The deferred revenue balance includes many parts of Tesla’s offering that have nothing to do with the vehicle sale itself, such as the company’s self-driving software and system software updates.
The other balance sheet metric that’s slightly relevant is the figure of Tesla’s customer deposits, which is the amount a customer has to put down when they place an order, such as the $250 deposit for a base Model 3. But that figure also includes deposits for Tesla energy products, so it’s not a clean auto number.
And the deposit amount per vehicle is highly variable based on the model and configurations. To my knowledge, no one has triangulated how customer deposits correlate to aggregate car demand for Tesla in any given period.
So, there is no good measure for demand, other than Musk’s upbeat tone — that, and the company’s past performance of generally increasing sales nicely over many years.
Other areas of tech have come up with additional measures to reassure investors. The prime example is the software industry.
A year ago, I wrote a broad overview about how the software world has sprouted numerous measures of the business quarter to quarter. The Metrics, as I term them, include tons of non-GAAP numbers such as “remaining performance obligation,” a measure of the total value of software contracts signed that has yet to be realized as revenue.
If Musk and Kirkhorn had any desire to reassure the investing public, they could disclose a similar sort of measure. For example, what is the total order book value in dollar terms? How much of that might be realized in a given period, the current portion of the order book, would be a nice complement.
Based on the remarks and tone of the conference calls, I don’t get the sense Musk and Kirkhorn have any urgency to provide such reassurance. And I expect they don’t want competitors to know those kinds of things.
That means investors will have to decide: Is this a company that’s priced too high for a recession, or will its position as an EV leader prove more durable than people suspect?
Even in a tough market, if prices fall as Chowdhry expects, Tesla would still be the best house in a bad neighborhood, as it is far ahead of the competition in making product.
I wrote last month that the crop of young contenders is a mess. Lucid, Rivian Automotive and Faraday Future have continued to miss expectations as they struggle to get to volume production. All three, moreover, are pricing their wares at the high end of the market, so none of them are a budget alternative.
Ford and others can be a budget alternative, but their progress in EV sales still leaves them far behind Tesla. A report in November by S&P Global Mobility stated that of 525,000 electric vehicles registered in the U.S. in the first nine months of this year, sixty-five percent, 340,000, were Teslas.
Ford has sold a total of 53,752 electric vehicles this year, making it number two behind Tesla, according to The Detroit News’s Jordyn Grzelewski. While Tesla doesn’t report regional numbers, you could say that based on deliveries of 343,830 cars, worldwide, in the third quarter alone, there is good reason to believe Ford is a very distant second place.
If Tesla isn’t recession resistant, but if management won’t reassure investors with additional disclosure, then at some point, the prospect of share buybacks will probably become one of the most important parts of the story.
Musk during October’s earnings call indicated a big buyback is a distinct possibility.
“We've debated the buyback idea extensively at board level,” said Musk. “The board generally thinks that it makes sense to do a buyback.
“Even if next year is a very difficult year, we still have the ability to do a $5 billion to $10 billion buyback,” said Musk.
“This is obviously pending board review and approval, so, it's likely that we'll do some meaningful buyback.”
Procore CEO: Taking the construction business from ‘analog’ to rational
Dec 15, 2022
Having watched a few companies fall under the weight of their own promises over the years, I’m pleased anytime I hear a company’s management being circumspect in how it talks to the Street. It is absolutely always better to under-promise and over-deliver.
“I wanted to stress that these new opportunities are still in very nascent, kind-of, sapling days,” says Craig “Tooey” Courtemanche, CEO of Procore Technologies, in a chat he and I had this month via Zoom.
Courtemanche is talking about his company’s first analyst day meeting since coming public in May of 2021, which took place last month. The analyst day is the annual ritual when tech companies regale the Street with tales of future product and future financial success.
It’s a day, in other words, often laden with a certain amount of puffery, and so it’s an especially dangerous time for promising too much.
Courtemanche used the day to talk up some planned enhancements to Procore’s software. Procore makes programs that can streamline the challenges of the construction industry, selling to thousands of general contractors, sub-contractors and project owners.
At the analyst day, he talked up future offerings, specifically, a financing and an insurance component. In the former, Procore will purchase materials directly from originating suppliers, with the help of financial partners, and then resell them to the customers. In the case of insurance, the company will connect the customer to competitive policies, again through partners.
It’s a thrilling prospect, potential “FinTech”-style businesses, ones with potentially huge incremental profit. It is also a bold attempt to re-write what construction is as an industry: to turn it from an “analog” business where things get decided over games of golf, to one of rational data analysis.
As exciting as all that sounds, Courtemanche took care to emphasize to the Street during his presentation that the saplings, as he refers to financing and insurance, are small projects in incubation, not yet prime time.
“I wanted for them not to over-index on them in the short term and put them in their models,” he says, meaning, the spreadsheet projections of Procore’s future that financial analysts construct.
And that’s just fine, actually, because the existing business at Procore is working nicely.
PROFIT PROMISE
The fiscal third-quarter earnings report, delivered November 2nd for the quarter ending in September, was the sixth quarter in a row the company beat revenue expectations, and the fifth quarter out of six that showed a smaller-than-expected net loss. And the forecast for this quarter’s revenue beat expectations as well, marking the third time the company has raised this year’s revenue outlook.
All that at a time, mind you, when many, many software vendors have had to reduce their outlook.
Nor is Procore a tiny shop. It is on track for $800 million in revenue this fiscal year, increasing at a rate of thirty-four percent, and the Street is modeling that going to $970 million next year, so it’s a sizable software business.
Procore is not yet a profitable business, however, and likely won’t be until 2026, according to Street consensus. Courtemanche has told me in our prior conversations that the company’s commitment to investors is to “continue to show progress to cash-flow breakeven.”
On that score, CFO Paul Lyandres, during the analyst day, didn’t promise specific deadlines, but instead showed a rather complicated slide that represented how profit margin improves over time. His point was that Procore has demonstrated an ability to narrow its negative non-GAAP operating profit margin by three to four percentage points a year over a multi-year period.
“So, it might not happen every year in a consistent fashion, but that's the right trend line,” said Lyandres.
Free cash flow, he said, will “mirror” that improvement in operating margin in coming years, maybe even “slightly better improvement.”
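Lyandres’s trend line implies a rough timetable, which can be sketched with a placeholder starting margin (not a company figure):

```python
# A negative non-GAAP operating margin narrowing three to four
# percentage points a year eventually crosses breakeven.
def years_to_breakeven(starting_margin, improvement_per_year):
    years = 0
    margin = starting_margin
    while margin < 0:
        margin += improvement_per_year
        years += 1
    return years

# Hypothetical: from a -12% margin, improving 3.5 points per year.
print(years_to_breakeven(-0.12, 0.035))  # 4 years
```

From a hypothetical minus-twelve-percent margin, the midpoint pace reaches breakeven in about four years, which squares with the Street consensus that profitability arrives around 2026.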
WHAT IS THE ‘PLATFORM’?
In our chat via Zoom, my agenda with Courtemanche was not the current financial picture, but this notion brought up during the analyst day that the software is becoming a more complex collection of functions, including the forthcoming FinTech stuff.
Numerous times during the analyst day, Courtemanche used the term “platform.”
“What is that, and what does that mean to you?” I asked.
“The way we think about it, is, a common data environment that we build on top of so that everyone can be assured that they're working off the same set of information” when something is being built, is Courtemanche’s definition of the platform.
Adobe’s version of platform software includes Photoshop, Illustrator, and so on. In the case of Procore, prior to 2016, the company sold just one function to general contractors, project management software. That has now sprouted multiple different “apps” such as for bid management and invoice management.
“Our customers value us because we manage estimating to bidding to contract management to change orders, invoicing, lien waiver management, and now we're going to launch Procore Pay,” a new function going live sometime later in 2023, that will streamline how a general contractor pays a sub-contractor.
“So, this is a very complex workflow that they deal with every single day, and once we have Pay and we're in the flow of funds, and people look to Procore as, kind-of, the funds control for their projects, we then think that that exposes a whole bunch of these novel business ideas on top of that that we can explore.”
Prior to 2016, Procore sold just one function to general contractors, project management software. That has now sprouted multiple different “apps” such as for bid management and invoice management, and what the company terms a platform, a means for itself and partners to continually add new functions.
Most intriguing is how data collects in that platform. The general contractor and the project owner and the subcontractors all have various information they work off of about dates and materials and payments and specifications, and all the other stuff about a project, says Courtemanche.
“That’s very valuable, to Procore but also to our customers, because we can give them insights and predictive analytics around projects,” he says.
“Imagine if you're a contractor and you're building a hospital,” offers Courtemanche.
"If Procore can pop up in the middle of your job and say, ‘Look, based on the number of RFIs [request for information, a planning tool] that you have of unapproved submittals, you have a twenty-five percent chance of losing money on this job, you should try to do A, B, and C’ — and we know that because we have so many hospitals on the system that we can look at it and do cohort analysis — that becomes very, very valuable to a contractor,” he says.
DATA SCIENCE FOR CONSTRUCTION
Quite valuable, potentially, because the data becomes the basis of the FinTech services he is working on, one known as “Mat Fi,” materials financing, where Procore and banks will lend construction firms money for materials; and a future insurance product, where Procore and banks will sell contractors and their collaborators project insurance.
“Those are the types of things that we're trying to build now with our data scientists and our insights team, is to be able to deliver actionable insights,” he says.
“We can put together risk profiles which enable our businesses around Mat Fi and Insure because we know where risk lies because of all the data.”
The data gathered from all these customers is anonymized, says Courtemanche. Customers still own their own data, and it is not leaked, he pledges. Instead, averages are gathered that can be meaningful across the whole customer cohort.
“Customers are very grateful that we can take the averages of different data elements and present them back to them and show where risk lies on a project that they're running,” he says.
NO MORE GOLF GAMES?
How far can that go, I wonder, because construction has always had its way of doing things, and data science is not the first thing that comes to mind.
In such established fields, some of the “ecosystem” doesn't want to become more rational, I point out to Courtemanche.
True, he says. “I'll tell you, in today's world, general contractors will select their sub-contractors based off of who they played golf with last — it’s not a data-driven decision.”
But, says Courtemanche, the industry is actually “really desperate” to do things smarter. “You could be a sub-contractor on a job, and you could be winning big time on that job, but if the job itself is failing because of other sub-contractors, you're about to lose, right? So nobody can win in isolation.”
The data platform he’s building, he insists, will take the industry “from analog, where they're playing golf, to actually making good business decisions with the data.”
Another example, he says, is the half-a-trillion dollars’ worth of “re-work” that goes on annually in construction, stuff that should never have been done to begin with. “That brand new air handler should never have been put into a building, ripped out two days later, and put into a landfill,” he explains, a source of enormous cost overruns and just plain waste.
“We believe we can have a massive impact on that because people will be working on the single source of truth, the right information” about a project. “You won't have as much of that waste, which should, by definition, drive down either the cost of construction or drive up the margin of people's businesses.”
EVERYONE INTO THE POOL
The final element in the platform push by Procore is the company’s rather unusual practice of letting as many people use the software as are involved in a project. Most software vendors charge a seat license, or, these days, a seat subscription, for a particular user. Procore instead has an “unlimited use” policy: contractors, their sub-contractors, and lots of others can use the same Procore program licensed to the contractor.
The point, says Courtemanche, is that this is how the software spreads.
“Let's say we sell to your general contractor, you become a customer, you upload your directory [into the program]: all of your sub-contractors are now in Procore and actually on the jobs in which you're using them,” he explains.
Those collaborators are “engaged in Procore on a daily basis, not paying us anything, but they're basically in your account, finishing their daily tasks, and we can tell those folks, ‘Hey, you work with lots of general contractors and across lots of different projects, wouldn’t you want your own account to manage your bids and your current projects and your work?’”
It’s a try-before-you-buy thing, and although there is not exhaustive buying data, says Courtemanche, “we do know that the heavily engaged collaborators have a high propensity of becoming customers.” New customers tend to be previous Procore users by a wide margin, he says.
The Procore Pay product going live in 2023 will be the first test of the more ambitious platform strategy. Procore has partnered with Goldman Sachs to run that payments system for customers.
The forthcoming materials financing bit and the insurance bit are works in progress. “We are now growing [Mat Fi] into slightly larger than a small business, but it's still pretty small,” he says.
Ultimately, Mat Fi and Insurance will involve partnerships as well. Procore has “a short list and we definitely have not chosen one” yet for financing. For the moment, “we are still operating off our balance sheet” as far as lending money in beta mode, he says, “but it's a very, very small amount of capital, so it's inconsequential today.”
Shares of Procore, at a recent $52.88, this year are down thirty-four percent, and are up five percent since the third-quarter report.
GitLab CFO: We’ve executed really well in a tough market
Dec 14, 2022
I generally think I can pattern-match pretty well with respect to corporate earnings, having reported on them for decades.
Take, for example, GitLab, a young software vendor that came public in October of last year. It sells a set of software capabilities, woven together, that serve application developers. On the simplest level, it is what is called a “version control system,” a repository where coders put their pieces of code, one that tracks which is the latest version, who has what, and other such functions.
The company a week ago reported its October-ending fiscal third-quarter results, and it was the fifth quarter in a row since going public that the company has beaten expectations — by a very healthy margin, I should add — and the fifth time its forecast was higher.
In a case like that, where a company seems to be defying gravity, my sense of pattern tells me some on the Street might just start to expect the company has to stumble at some point, to have a sophomore slump. No company can keep a streak going forever.
“That’s fair,” says the company’s chief financial officer, Brian Robins, when I propose my theory to him during a Zoom interview last week.
There were, he admits, “watch points” in the quarter, things to keep an eye on. Specifically, “there's starting to be a little bit more deal scrutiny,” last quarter, by which he means, “there's a higher level of sign-offs required, basically, to get some deals done.”
More generally, he adds, “the second quarter, we weren't feeling any macro when a lot of companies were,” meaning, the broader economic distress didn’t touch GitLab’s business. “We felt that more in third quarter,” just ended.
Some of the company’s expansion business with its existing software customers, when companies license more “seats” to use the program, was not as high as it should have been, based on GitLab’s statistics of its customer “cohorts.”
“It did have a material impact,” he says of the economic pressure, meaning, GitLab left money on the table last quarter.
Still, none of that is hurting reported results, obviously. GitLab is holding it together while other companies are trimming their forecasts.
GitLab isn’t stumbling by any means. My pattern recognition, in this case, seems to be off.
“You know, we executed really well in a tough market,” says Robins. “We delivered sixty-nine percent year-over-year revenue growth; we beat consensus by seven percent [for revenue] when a lot of companies aren't even beating, they're missing consensus; we did a small raise on top of that, so we had a beat-and-raise quarter.
“And then we also gave our soft guidance for fiscal year 2024, and also said that we're targeting to be cash-flow breakeven in fiscal year 2025.”
The term “soft guidance,” in this case, refers to Robins’s preliminary assessment that next fiscal year the company will probably see forty percent revenue growth, about in line with consensus. With this, GitLab reassured analysts.
If things are not breaking down at GitLab, contrary to my pattern-matching hypothesis, what is happening?
“I like to try to do pattern recognition, and I came up with a hypothesis,” says Robins. He has told me in the past that GitLab is the “best-prepared” among tech companies, doing assiduous research before, during, and after earnings season.
“You know, there’s been two hundred and ten thousand tech layoffs this year, and forty percent of that happened in the third quarter,” he says. That was, he notes, just around the time Meta, Google, Amazon, Twitter, and other firms were announcing a bunch of giant layoffs. “It almost felt like there’s this sentiment in the market, where executives said, holy crap, where is this economy?”
“It reminded me of when COVID first broke out,” he says, back in February of 2020. “A big drop in the stock market, and everything just, sort-of, seized up.”
Business for GitLab has not, however, seized up. Although the expansion deals were held back last quarter, the company had a stellar quarter for what it calls “first orders,” when first-time customers make their first purchase. The first orders, referred to as “new bookings,” rose seventy-five percent, year over year.
Robins had expected new orders to perform about in line with expansions; instead, they did much better.
“To me that is awesome in the sense that, you know, people are coming on the platform,” says Robins. “I think the economy is really helping push people towards doing more with less, greater collaboration, get more efficient, show an ROI [return on investment] — and so, I was super-happy with that.”
To Robins, the fact new customers are coming in the door at a time of economic uncertainty, when repeat business is harder to sign, is evidence of his conviction, which he’s told me before, that GitLab’s programs are “a mission-critical platform,” so much so that “we have some resiliency to the broad macro markets and things that are happening.”
To play devil’s advocate, I ask Robins if his firm should step on the gas, in Street terms, meaning, spend bigger to apply even more sales and marketing effort to win business when other vendors are struggling.
You can’t artificially pump up results, is Robins’s reply. “You can't unnaturally grow way faster than the market, and you’re not going to unnaturally grow way slower than the market, but there's a range that you can grow in, and you just have to make sure that you invest accordingly to that range."
“The feedback there is, we've been really consistent,” in how the company talks to the Street, says Robins. “Our number-one objective is to grow, but we'll do that responsibly, and so that's why we've grown into improved operating leverage in the model.”
The operating leverage, in this case, is adding forty-six million dollars more in “incremental” revenue last quarter, with $2.3 million less in operating loss.
Last quarter, operating profit margin, while still negative, improved by a whopping seventeen percentage points, year over year. It is likely that in 2023, GitLab will add something like another one hundred and seventy million dollars in revenue, based on Street consensus, even as costs go down, for more operating leverage.
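To make the leverage concrete, here is a minimal sketch of the arithmetic. The revenue and operating-income figures are hypothetical, chosen only to approximate the magnitudes discussed above (roughly $46 million more revenue with a loss about $2.3 million smaller), not GitLab’s actual reported numbers:

```python
def operating_margin(revenue, operating_income):
    """Operating margin as a fraction of revenue (negative while loss-making)."""
    return operating_income / revenue

# Hypothetical prior-year quarter: $67M revenue, $27M operating loss
prior = operating_margin(67.0, -27.0)

# Hypothetical latest quarter: $46M more revenue, loss narrowed by $2.3M
latest = operating_margin(113.0, -24.7)

# Growing revenue while the absolute loss shrinks compounds into a large
# margin improvement, on the order of the seventeen points cited above
print(f"margin improved by {(latest - prior) * 100:.1f} points")
```

The point of the sketch is that operating leverage comes from both directions at once: the denominator (revenue) grows while the numerator (the loss) shrinks.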
Does the investor pattern-match on all that’s working here? Do they, as I had suggested, fear some kind of stumble for GitLab?
It doesn’t sound like it.
“In three days, I spoke to thirteen analysts and over fifty investors” last week following the report, says Robins.
“When you got to the investors, the questions were all over the board,” he recalls. “I think you know you had a pretty good quarter when there’s not, like, three or four key themes everyone is asking about.”
GitLab stock this year is down forty-three percent, and it is off fifty-two percent since the IPO.
Oracle’s growth speed-up continues, touts gigantic cloud customers such as Nvidia
Dec 13, 2022
Shares of software giant Oracle rose by about two percent in late trading Monday evening, as the company reported another quarter in which revenue growth sped up, echoing the prior quarter, with the focus heavily on how the company is winning new customers for its cloud computing business.
Founder and CTO Larry Ellison rattled off names of “big customers” who had moved to using the company’s cloud service, including FedEx, Deutsche Bank, and the Tokyo Stock Exchange.
"We're the only ones running a major stock exchange” of all the cloud service providers, said Ellison.
Given a rising backlog of business, “we expect our infrastructure business to continue to grow very, very strongly into the future.”
The forecast for this quarter’s revenue, in addition, was ahead of the Street, with a projected revenue range of $12.3 billion to $12.5 billion versus consensus of about $12.28 billion. That’s stronger than the forecast offered in September.
Oracle is, in a sense, the start of a new earnings season. Although it is one of the last companies to report in the current cycle, it closes its books faster than other tech companies, and so it is reporting not on September or October results, as most companies have been, but on the quarter ended in November.
Sales once again topped expectations, even though foreign exchange continued to hamper reported sales growth, reducing the top-line number by six percentage points. The “constant currency” growth rate of revenue would have been twenty-five percent, though reported growth was a still-healthy eighteen percent. That figure includes one and a half billion dollars of revenue from Cerner, the health care information systems giant that Oracle purchased in June for twenty-nine billion dollars.
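The constant-currency adjustment can be sketched quickly. The figures below are hypothetical, chosen only so that the resulting rates roughly match the eighteen and twenty-five percent mentioned above; the idea is simply to compare revenue as reported against what it would have been at the prior year’s exchange rates:

```python
def growth(current, prior):
    """Year-over-year growth rate as a fraction."""
    return current / prior - 1.0

prior_year = 10.4          # hypothetical year-ago quarterly revenue, $B
as_reported = 12.3         # hypothetical latest quarter, $B, in dollars
at_constant_fx = 13.0      # hypothetical latest quarter at year-ago FX rates

reported = growth(as_reported, prior_year)
constant_currency = growth(at_constant_fx, prior_year)
print(f"reported: {reported:.0%}, constant currency: {constant_currency:.0%}")
```

A strong dollar shrinks the dollar value of overseas sales, so the reported rate understates the underlying, constant-currency rate.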
CEO Safra Catz noted the company’s continued revenue speed-up: “Even excluding Cerner, total revenue grew 9% in constant currency,” she said. "That's higher than Q1, and on top of a revenue beat this time last year,” she added.
Catz reiterated an expectation that the company’s cloud computing business will rise faster this fiscal year ending in June, stating, “our business continues to accelerate, we expect organic growth for our fiscal year 2023 Cloud revenues will be over 30% in constant currency.” The term “organic” here means excluding the portion attributable to Cerner. Last fiscal year, growth was twenty-two percent, and Catz had said in June, on the fourth-quarter call, that the growth rate would pick up.
Analysts seemed clearly delighted with the cloud growth. Oracle is much smaller than Microsoft, Amazon and Google in cloud “infrastructure,” the basic running of workloads. Its revenue for “IaaS,” the portion that is infrastructure “as a service,” was just a billion dollars last quarter. But, as analyst Phil Winslow of Credit Suisse noted on the call, the rate of growth of IaaS last quarter sped up from fifty-eight percent in the August quarter to fifty-nine percent this past quarter.
Asked how the company is speeding up, Ellison’s reply was partly boosterism — the continued move of computing to Oracle’s cloud from either on-premise or other clouds — but he also gave an interesting insight into how stuff is moving to cloud generally, including artificial intelligence.
“The workloads, AI and machine learning, is a huge – is exploding,” said Ellison. “Nvidia, the people who provide the GPUs for most AI workloads, they're moving a huge amount of stuff to the Oracle Cloud and a bunch of other companies that are doing that.”
Sifting what’s perhaps broadly interesting in all this, I would say it is a) companies continue to plow money into using cloud computing, so that they are shifting how they run their operations in spite of the fact the economic outlook is supposedly volatile; and b) Oracle itself is continuing to spend to build out its cloud operations.
Said Ellison, Oracle has forty “public cloud regions” around the world, and nine underway. He said the company will continue to spend on building such data centers.
“We are careful to pace our investments appropriately, but need to continue to build to meet our accelerating demand,” said Ellison.
That, I think, should be good news for networking firms, including Cisco Systems, Arista Networks and Juniper Networks.
Overall, I’d say the report suggests data centers should continue to be a relative area of health for tech for the foreseeable future.
China’s opening up to play havoc with chips, says Lynx
Dec 13, 2022
The Philadelphia Semiconductor Index’s three-month return. Its surge from mid-October has been among the best areas of tech stocks of late.
China’s pivot away from “COVID Zero” as its governing approach to the virus is leading to a surge in cases that will have a “dampening” effect on semiconductor production, opines chip observer RC Rajkumar of the boutique Lynx Equity Strategies in a note to clients Monday.
That might not bode well for semiconductor stocks, which have been among the best tech performers of U.S. issues recently, with the benchmark Philadelphia Semiconductor Index up almost a percent in the past month versus a 1.6% decline for the Nasdaq Composite Index. The Philly has risen despite continued negative data about inventory build-ups of chips and weakening tech spending.
China’s government last week eased off on the lockdowns imposed as part of its COVID Zero policy, and Bloomberg on Sunday reported that Covid is “rapidly spreading through Chinese households and offices.”
The supply chain, writes Rajkumar, “has been looking forward to upside as China eased up” on its stringent COVID measures, but, “overnight media reports warn of a dramatic increase in Covid cases across China,” including in Beijing, Shanghai, Shenzhen and Canton. Rajkumar doesn’t cite specific articles.
Rajkumar references “checks” suggesting that smartphone chip production, specifically, is already hampered. The current unlocking corresponds with some of the heaviest travel time in China, he writes, leading up to the Chinese New Year on January 22nd.
“Smartphone component makers are planning for a slowdown, in anticipation of CMs [contract manufacturers] reducing capacity in CQ1 [calendar Q1 of 2023] as Covid-infected workers return after Chinese New Year travel,” writes Rajkumar.
“While the expected slowdown is likely not as bad as a government mandated shutdown, the supply chain is nevertheless planning for CM build plan under-shipping end demand.” That will probably hamper already strained supply of Apple’s iPhone 14, Rajkumar writes.
Rajkumar cites the example of what transpired in India before that country began widespread vaccination efforts, suggesting that COVID cases in China are “likely to explode to the upside in the near term.”
“Caseloads then hopefully comes down in 3-6 months as herd immunity kicks in,” he writes, while cautioning, “There is little evidence globally of caseloads dropping sharply on the basis of herd immunity alone.”
Silicon carbide investing: Susquehanna offers ST Micro as the reasonable alternative to Wolfspeed
Dec 13, 2022
Chip maker Wolfspeed has gotten a lot of attention from the Street as the “pure play” with respect to the relatively new chip technology silicon carbide. But Wolfspeed’s high valuation — eleven times this year’s expected revenue, based on enterprise value — has left some looking for less-expensive investing alternatives.
That’s the situation for Susquehanna Financial’s Christopher Rolland, who on Monday morning initiated coverage of Wolfspeed with a Neutral rating, instead preferring Analog Devices and STMicroelectronics, starting those stocks at “Positive.”
“While we recognize the company’s clear leadership today, shares trade at a premium valuation,” writes Rolland, “even when adjusting to the company’s ambitious long-term target model, implying near perfect execution.
"We therefore opportunistically await a more favorable set up,” he concludes.
Silicon carbide, of course, is a chip technology that is leading to greater driving range in electric vehicles, as I detailed in a piece in February. The technology is used most prominently in what’s called the “traction inverter,” a component in a car that sits between battery and motor and that converts direct current battery power to alternating current power to drive the motor. More effective conversion makes better use of the battery, and SiC, as it’s known, has that attribute.
Rolland sees pretty stunning growth for silicon carbide through the end of this decade, most but not all of that from automotive applications:
We forecast the overall SiC (devices and materials) TAM will reach $10 billion by 2030, a +17.6% CAGR through 2030. Likewise, expect automotive SiC revenue to grow at a +19.1% CAGR from 2022-2030, ultimately reaching a TAM of $8 billion. Rising EV penetration is a key driver for SiC growth, with the highest value in traction inverters as we believe ~65% will contain SiC MOSFETs by 2030, a stark increase from the current level of ~10% as OEMs shift from IGBT-based power products to SiC.
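Rolland doesn’t state the starting market size those end-points imply, but a CAGR is easy to invert. A quick sketch, taking the quoted figures at face value and assuming eight compounding years, 2022 to 2030:

```python
def implied_base(future_value, cagr, years):
    """Back out the starting value implied by a future value and a CAGR."""
    return future_value / (1 + cagr) ** years

# Rolland's forecast: $10B total SiC TAM by 2030 at a 17.6% CAGR,
# and $8B in automotive SiC at a 19.1% CAGR, both from 2022
total_2022 = implied_base(10.0, 0.176, 8)
auto_2022 = implied_base(8.0, 0.191, 8)

print(f"implied 2022 total SiC TAM: ${total_2022:.1f}B")   # roughly $2.7B
print(f"implied 2022 automotive SiC: ${auto_2022:.1f}B")   # roughly $2.0B
```

In other words, the forecast implies today’s silicon-carbide market is still only a few billion dollars, with automotive already the bulk of it.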
Wolfspeed is the leader, but Rolland’s enthusiasm is tempered by the company’s massive spending plans. As I reported in October, Wolfspeed’s CEO, Gregg Lowe, has told the Street the company needs to be able to cover six and a half billion dollars worth of capital expenses over the next several years to expand its SiC factories in the U.S.
While acknowledging Wolfspeed’s enormous head-start, Rolland writes that the big spending is a turn-off:
Wolfspeed has spent decades driving Silicon Carbide manufacturing to become the undisputed market leader in materials/wafers today. Furthermore, the company is quickly and successfully building its capabilities in finished semiconductor devices. However, we note these capabilities require capital at a cost that is dilutive to shareholders in the near term. Furthermore, competition is coming on fast, and risks of commoditization remain a possibility.
STMicro, writes Rolland, is just starting out in SiC, and SiC may be something that increases the company’s sales growth rate, he opines:
On top of STMicro’s core analog and power management business, the company is addressing new greenfield opportunities, including Silicon Carbide, Gallium Nitride, connected MCU, 3D Sensing and more. These new opportunities can allow the company to reaccelerate growth and expand margins beyond today’s conservative mid-40%s. This reacceleration in growth, combined with margin expansion, could help drive a meaningful re-rating in the valuation multiple, which today remains in the bottom decile of the industry. Initiating Positive with a $50 price target.
Analog Devices is not a way to play SiC, per se; it is just a great chip maker that has “opportunities around electric vehicles (BMS), communications (5G RF), and specialty analog (medical, instrumentation, aerospace),” he writes, that “should help maintain industry-leading margins and growth.”
Rolland doesn’t mention the other big SiC chip name, ON Semiconductor.
Shares of Wolfspeed this year are down twenty-six percent, while ST Micro is off twenty percent, ADI is flat for the year, and ON Semiconductor is up five percent.
If you want even more SiC-related names, check out the long table of stocks at the bottom of my February article.
The TL podcast for December 18th, 2022: Chip stocks break their winning streak, and what ails Tesla.
Dec 11, 2022
The week was a reprieve for software stocks, with GitLab, SumoLogic, MongoDB, C3.ai and DocuSign among names seeing big jumps in price; Alteryx is imagining a big role for its software going forward; ChatGPT captures the imagination; and subscriptions are coming to The Technology Letter.
Remember, Salesforce and VMware were young companies that rocketed out of the Great Recession
Dec 11, 2022
I’ve been thinking a lot lately about The Great Recession of 2008 to 2009, and one of my favorite activities with historical stock data is to look back at what happened to various tech names going into and coming out of that period.
The key thing that you should meditate on is that some great companies that came public just before the recession came out of that economic contraction stronger. People will tell you new technology suffers in a downturn. CIOs, they might say, won’t spend on new stuff, they’ll buy the essentials to which they are already committed.
Don’t be misled by that conventional wisdom. Important technology thrives and ultimately triumphs through a downturn.
I’d cite two important examples in particular that really make the case for new technology. This is not cherry-picking, though it is selective.
VMware came public moments before the recession, on September 28th of 2007. The official start of The Great Recession, according to the National Bureau of Economic Research, the body in the U.S. charged with setting these landmarks, was January of 2008, following the peak in activity of the preceding economic expansion in December.
A few years before VMware and the Recession, Salesforce came public in July of 2004.
Both companies were not just young public companies in January of 2008, they represented relatively new technology waves at the time.
There was nothing like VMware as a company when it came public, even though the company by then had been shipping product for nine years. The only comparable software efforts that emerged at the time were from inside of public companies whose main software business had nothing to do with the “virtualization” that VMware spearheaded. In 2008, Microsoft was VMware’s biggest competitor.
To some extent, Citrix Systems had some overlapping offerings. Later, Intel added some virtualization to the chips it sold. Versions of virtualization were already emerging from the open-source software community, such as Xen, but it took a few years still for the biggest of them, such as OpenStack, to emerge. Follow-on works, such as the Kubernetes container management software, didn’t emerge until 2015. And Nutanix, the closest comparable as a company, wasn’t founded until 2009.
So, September of 2007 was a time when it was still early in VMware’s influence. Mind you, VMware had already hit $1.3 billion in revenue in 2007, so it was an established company in that respect, even though it was still early in its mission.
Like VMware, Salesforce in 2008 had established itself as a successful company, but it hadn’t yet changed the landscape. For the fiscal year ending January of 2008, it had racked up $750 million in sales.
Salesforce was the poster child for cloud computing for years because there weren’t any other “SaaS” — software as a service — companies back then. Its biggest competitor, Workday, which had been founded a year after Salesforce’s IPO, didn’t come public till 2012. And large, entrenched software vendors in 2008 were still largely disparaging of Salesforce in public. It took Larry Ellison of Oracle years more to get with the SaaS thing.
Microsoft’s Satya Nadella would not take over from Steve Ballmer until years later, 2013, which is when Microsoft’s SaaS engine really started to hum.
Consider that when venture capitalist Marc Andreessen wrote his op-ed for The Wall Street Journal in 2011, two years after the recession, about how “software is eating the world,” he referred to Salesforce as a new kind of software giving competition to older, established software vendors Oracle and Microsoft.
Here is where it gets interesting. Both stocks had deep declines during most of 2008, with VMware dropping sixty-eight percent between January of 2008 and June of 2009, the entirety of the eighteen-month contraction. That was the total damage. And, it dropped a staggering seventy-nine percent from January of 2008 till it finally bottomed on December 1st of that year.
I’ve laid out the course of events in a chart of the two for the whole eighteen months, comparing the two to the broader market:
Similarly, Salesforce lost thirty-nine percent from January of 2008 to June of 2009, and from January of 2008 till its bottom on November 19th, it lost sixty-five percent.
However, both stocks saw solid gains from their lowest points. From its bottom in December, VMware went on to notch a fifty-two percent gain in the subsequent seven months of the Recession, and Salesforce saw forty-two percent upside in the ensuing eight months.
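The arithmetic of those round trips is worth spelling out: percentage moves compound, so a deep drop followed by a big rebound still nets out to a large total decline. A minimal sketch, using VMware’s figures:

```python
def compound(*returns):
    """Chain successive period returns, e.g. a drop then a rebound."""
    level = 1.0
    for r in returns:
        level *= 1.0 + r
    return level - 1.0

# VMware: down 79% to its December 2008 bottom, then up 52% afterward
total = compound(-0.79, 0.52)
print(f"net move over the whole span: {total:.0%}")  # about -68%
```

That minus sixty-eight percent is consistent with the total January 2008 to June 2009 decline for VMware cited earlier, which is why a fifty-two percent rally off the bottom can coexist with a stock that is still deeply underwater.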
Both of these stocks — and this was characteristic of many tech names — bottomed way before the broader market, which didn’t bottom until March 9th of 2009, in the case of The Nasdaq Composite Index and the Standard & Poor’s 500 index.
More important, both companies grew financially during the recession. VMware saw an amazing forty-two percent growth during 2008. Even though its growth was drastically reduced in 2009, at almost eight percent it was still growth. The following year, 2010, VMware rebounded sharply, with sales growth rising forty-one percent.
More remarkable, Salesforce saw an amazing forty-four percent increase in sales in the fiscal year that roughly corresponded to 2008. It then went on to twenty-one percent during 2009. After that year, growth picked up to the mid-thirties on a percent basis.
Both of these companies became richer companies, too. Free cash flow for VMware increased a staggering forty-five percent during 2009, to almost nine hundred million dollars that year, real cash profits. Salesforce’s free cash flow increased a slower but still very respectable twenty-eight percent in 2009.
So, what does all this tell you? Good companies with a hand in important technologies that are changing the landscape continued to sell even more stuff during the worst economic contraction the U.S. had seen since World War II. The growth rates were no doubt diminished by what was going on, but sales did not collapse. The companies became richer on a real cash basis, and their shares bottomed ahead of the broader market and saw healthy gains months before the Recession was over.
Not a bad outlook for tech stock investing. My recollection of covering the market during those later months of 2008 is that everyone was still staggering, dazed and confused by the collapse of Lehman Brothers and the near-collapse of Merrill Lynch, and the folding of retail banking operations such as Washington Mutual. Very few people could see their way to a market rebound in March of 2009, and few were seeing what was already a turn in a couple of great stocks.
Now, the mind turns naturally to the question of which companies today might be in a similar position. Off the top of my head, I would say that software companies at a similar scale and similar rates of growth seem like parallels. In that regard, Snowflake, one of the TL20 stocks to consider, is a highly successful cloud software company that just reported nearly two billion dollars in revenue in the twelve months ended in October, similar to the scale of VMware back in 2008. And it has some small free cash flow, similar to Salesforce and VMware.
In a similar vein, Confluent is a software vendor of important infrastructure that has just notched a little over half a billion dollars in trailing twelve-month revenue, though it won’t be profitable for another couple of years.
If you want to know how those two stocks have fared this year, relative to the seventy-nine and sixty-five percent declines of VMware and Salesforce, as a rough approximation, Snowflake is down fifty-seven percent this year, and Confluent is down seventy-two percent. You could say that the declines of both stocks — again, without any clear indication a recession has happened or will happen — approach the kinds of declines that their predecessors endured, allowing, of course, for the fact that Snowflake and Confluent were declining this year from very extended, if you will, valuations, which makes the comparison difficult.
I don’t have a crystal ball to tell you those are the two winners. And I suspect there are several other very good candidates, maybe not all of them as far along in revenue. The bigger takeaway is, companies with important technology and market momentum tend to come out alright on the other side of a financial contraction, and if their technology is really meaningful, it even changes the landscape.
A letter from the editor
Dec 10, 2022
The Technology Letter editor Tiernan Ray discusses the forthcoming TL subscriptions.
I conceived of The Technology Letter in August of 2020 as a way to move past media’s fixation with just a handful of companies, Apple, Alphabet, Amazon, Meta, Tesla, and talk about a lot of other things investors care about.
In order to endure, The Technology Letter must become a profitable business. Later this month, subscriptions will go live.
For $30 a month, you’ll get access to the entire site’s content. Readers who already receive the free email newsletter will get an automatic discount to $20 a month.
I have a lot planned for The Technology Letter in the way of future features, and your patronage will be key to that.
In addition to leaving any thoughts about the plan in the comments below, feel free to email me at tiernan@thetechnologyletter.com.
Alteryx CEO Anderson: the platform is next
Dec 08, 2022
Anderson in the company’s Manhattan satellite office. The biggest companies, he says, use lots of cloud services. “They can't put all their eggs in one basket.” He believes Alteryx can be an “orchestration or automation layer” across cloud services and on-premise. “We think there's permission that exists for us to build that platform as an independent company.”
Mark Anderson has spent two years “re-tooling” software vendor Alteryx, as he puts it, since he came aboard in late 2020, to be better at going after the biggest customers.
He worked from a playbook honed at prior tech companies, such as Palo Alto Networks, where he was president for four years, and other big firms.
“The majority of the C-Suite is new, the majority of their teams are new,” says Anderson of Alteryx now, “and the people that were here prior to me coming on board, they've really worked hard to get evolved and enabled and trained for what customers need today and tomorrow.”
As Alteryx approaches what the Street expects will be a billion dollars in revenue next year, what comes next?
The “platform” is what comes next, a surprisingly ambitious bid to revamp the company’s offerings to be a much more comprehensive suite of software programs.
I sat down with Anderson this week at the New York satellite office of Alteryx in midtown Manhattan as he took a breather from customer meetings. Alteryx is based in Irvine, California.
It was two years ago that Anderson set out to change Alteryx, which he told me at the time needed new blood with a different pedigree, people who, like himself, had taken companies to multiple billions in annual revenue, people with “stage experience.”
The re-tooling of the team is working great for Alteryx’s financials.
After repeatedly missing expectations in 2020, the company in late 2021 began a very hot streak under Anderson. The third quarter reported last month was the fourth quarter in a row of revenue upside. And while some software makers are trimming their outlook because selling is getting harder, Alteryx raised its revenue forecast for the year for the third time in a row.
What happens now is that the product itself needs to grow up. The many programs that make up the company’s offering, led by its flagship app, Designer, are tools to let employees stitch together various data sources throughout an organization and to run analytic operations on them. It’s a window into operations, to see how sales have been trending, say, or pinpoint where future leads are going to come from, or how the new manufacturing target should be set.
In one of the many instances of what he calls “pattern recognition,” Anderson sees a shift in Alteryx customers. Before taking over the CEO role in October of 2020, Anderson had already been a director at the company for over two years.
“I used to talk to CFOs, just as a board member, before the pandemic, and they would say, ‘We'll get to our transformation when we're good and ready’,” he reflects. “And now, it's like they can't get there fast enough.”
“The pandemic, the ensuing supply chain nightmares, the rampant inflation and talk of recession — companies need help seeing around corners,” he says. They need more of the analytics functions, in other words, with a new urgency.
Giant customers, he says, are now saying they’re “digital” companies. “I’ve heard the CEO of Capital One say, ‘We’re a technology company that moves money around,’ right?” Another customer, Coca-Cola, “is a technology company that facilitates performance experiences for their consumers.”
Those companies need to run more analysis in order to make those transformations, but they can’t make their data scientists be both scientists and marketing and sales experts, he notes. “For one thing, there just aren’t enough of them,” given that the data scientist is a high-price-tag role.
And so, “you need to transform the functional knowledge workers with solutions that automate their output,” he says. The tool should take people in sales, in marketing, in product development — all over the organization — and make it so their work turns their expertise into valuable analytic insight.
“If you're a supply-chain person or an analyst in FP&A [financial planning and analysis], we want to make you automated and great at what you do,” is the pitch. “We don't need to send you to school for five years to teach you how to use Alteryx, you can do it in a day.”
At the same time as re-skilling is necessary, he says, some giant companies are in crisis mode in a digital age.
“I've had three customer meetings already this morning,” he tells me. “One just left, a large insurance company, they have thousands of knowledge workers that are still working primarily with spreadsheets.
“They're closing the books of one of the most famous insurance companies in the world — it’s really in the U.S. — with spreadsheets and manual work.”
This re-skilling, incidentally, is a mission of the company, he says. “We're advocating for an up-skilling of the world,” he tells me. “We're donating licenses to universities and polytechnics all around the world” to use the software for free.
It is also a belief, I learn, that runs deep with Anderson. “I was born in the slums of West Belfast,” he later tells me. “And my dad got an education, and we got out of Belfast as fast as we could.
“So, you know, education is super-important.”
And so, the immediate opportunity is to have companies use more of the Alteryx software on a daily basis, more broadly throughout the organization.
The first way to do that is to take a product that is used mostly on premise, meaning, inside a company’s own data center, and move it to public cloud computing facilities — to make it a cloud app, in other words.
Alteryx is one of those rare birds, a software maker that hasn’t yet transitioned to being a cloud computing vendor.
Because the Alteryx program ingests customers’ own data, cloud traditionally didn’t matter because the data was going to stay mostly on-premise. “Today, ninety percent of our customers’ data is still on-prem,” notes Anderson.
However, the logic to offering customers public cloud computing, rather than just a Microsoft Windows version of Alteryx, is to spread usage throughout the enterprise.
“With easier access” to the program, he says, "I can sell to more users, and if I sell to more users, I can sell to different personas” within a company.
That starts to lead to economies of scale for customers, says Anderson. “If I do more personas and more users, I can get the unit cost down for people dramatically,” he explains. With greater economics, “I’m not going to charge you the same that I charged you for the first hundred users that I will for the next thousand or the next two thousand.”
Hence, the move to the cloud is now underway.
“What we did in a hurry was really start to accelerate the cloud agenda,” says Anderson. “So, bringing in the right people in product and engineering that have seen the movie before, that have that pattern recognition.”
The move to accelerate cloud has been led by acquisitions of small, young firms that have brought new capabilities to Alteryx.
Hyper Anna of New South Wales, Australia, a purchase last year for undisclosed terms, led to one new cloud product, Auto Insights. That tool lets a business manager avoid analysis per se and instead have the system tell them where in the data to dig more deeply for potential insights.
The second cloud product, Alteryx Machine Learning, offers some “basically, pre-built machine learning models that Alteryx Designer users can really go get to without becoming a data scientist,” says Anderson. Machine learning is a popular form of artificial intelligence, and so it is one capability that many companies increasingly would like to work with.
Also last year, the company spent four hundred million to buy San Francisco-based startup Trifacta, which had built an analytics program of its own that already was running as a cloud service.
The Trifacta code has been a key addition to propel the flagship Designer program’s move to a cloud version, Designer Cloud.
"That was our re-platforming option,” he says of Trifacta. “It took what would otherwise be a five-year journey to basically rewrite our software in all three public cloud environments globally, and would have required more engineers than we have.”
Designer Cloud was first introduced in Amazon AWS in a simple version last year, before the Trifacta deal. Since the deal, Alteryx has been “knitting together” the desktop Designer features to the Trifacta infrastructure, he says. “We have a bunch of early adopters giving us feedback,” and the new Designer Cloud will go live in January.
In revamping Designer for the cloud, Alteryx is placing a big emphasis on governance functions. Governance wasn’t as important on-premise because IT was able to closely control access to data. Governance becomes a more sensitive matter in the cloud.
“You start making your innovation available in a public cloud, and you’re now starting to pull meta-data out of their environments, you’ve got to have governance and security that’s rock solid.”
“We over-rotated on innovation, and, in retrospect, probably under-rotated on governance and security,” says Anderson of the legacy Alteryx code.
Trifacta, he notes, already had worked on governance factors because the startup was cloud-first, as they say. At the same time, Alteryx itself has “chopped a lot of wood on governance” in the past two years, he says.
It will take time for the cloud versions to be an exact match with the on-prem versions, says Anderson, rather like how Microsoft’s Office 365 had to evolve.
“There will be a point where people will go, wow, this is identical” to the desktop versions, he says of Designer Cloud.
Alteryx isn’t yet disclosing the amount of its cloud revenue. Back in April, CFO Kevin Rubin told the Street Trifacta might produce total contracts, or “annualized recurring revenue,” worth twenty million dollars by the end of this year. But ARR is a non-GAAP metric; it is not the same as reported revenue. Rubin has said there is likely to be “limited revenue contribution this year” because of the timing of deferred revenue from the Trifacta acquisition.
Anderson says of future cloud revenue, simply, “I think it ramps.” As an early sign, Anderson told the Street last month the company signed two million-dollar-plus deals with customers for Designer Cloud.
A second avenue where Anderson has moved the company, in order to spread the use of Alteryx, is what are known as “enterprise license agreements,” or ELAs, a manner of pricing for usage that is “way more flexible.”
The ELA is “not rocket science,” says Anderson, but one of the many tried and true ways to sell to large enterprises that he saw in his time at Palo Alto and other shops.
“If you think about this in the old software contract, when you get to the one-thousand license threshold, and you go over, you're in trouble, we’re going to rap your knuckles,” he explains, referring to how vendors limit the number of users in a software contract.
“What our software does for businesses and for people matters so much, especially now of all times, we want them to go faster,” he says. So the ELA lets a customer “burst,” to temporarily extend the product to more users than they’ve contracted for. A thousand-seat license would now burst to one thousand five hundred, without the customer having to spend more.
The beauty of bursting comes when it’s time for a customer to renew their contract at the end of a year. “When it comes time for them to renew, we say, You’ve been using 1,500 for the last six months, do you want to pay for 1,500? Because if you do, we'll let you go to 2,250 — you know, fifty percent more.”
That can prompt more usage. “We say, Congratulations on transforming manufacturing as well as finance, let's get into supply chain.”
Already, says Anderson, forty percent of the ELAs Alteryx has sold are in burst mode.
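The burst arithmetic Anderson describes can be sketched as a simple rule: the ceiling is fifty percent above whatever seat count the customer is paying for. A minimal sketch in Python, assuming the fifty-percent headroom his examples imply (the function name is mine, not Alteryx’s):

```python
def burst_ceiling(paid_seats: int, headroom: float = 0.5) -> int:
    """Seats a customer may 'burst' to without paying more.

    Illustrative only: Anderson's examples imply fifty percent
    headroom above the contracted seat count.
    """
    return int(paid_seats * (1 + headroom))

# A 1,000-seat ELA can burst to 1,500 seats at no extra cost.
print(burst_ceiling(1_000))  # 1500
# Renew at 1,500 paid seats, and the new ceiling becomes 2,250.
print(burst_ceiling(1_500))  # 2250
```

Each renewal ratchets the ceiling up, which is why usage compounds rather than stalling at the original contract size.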
This is a savvy way of finding opportunity by not being chiseling. “Maybe they don't have the budget for it, but the sense of urgency for deploying it to more people is there,” he says of his customers.
“I've done this before, in a previous life, when we were doing virtual firewalls” at Palo Alto and F5 and Cisco, he says. “I did the exact same thing,” meaning, extending usage without gouging customers. “And it was like throwing gasoline on a fire because you accelerate adoption.
“People aren't doing this to run a worse business or to, you know, run a looser ship,” says Anderson of the need to throw more people at analysis. “They're doing this because they need to transform, and time is your enemy when you're transforming.”
Both cloud and ELAs are meant to bring more and more users in a company onboard and prompt a virtuous cycle economically.
“The more we can make it easy for people to adopt our technology, the quicker the journey, I think the more successful we're going to be, the more users we will have, the easier it will be for us to get unit costs down even further.”
What comes next after simply spreading usage is to be “more relevant,” he says.
What does that mean? “This industry has been its own worst enemy,” says Anderson of the data analytics software field. “There are a few legacy platforms that have not transitioned to be more modern.”
Companies such as Informatica, he says, and SAS, very prominent vendors of data management, “really haven't modernized.”
"I've been in tech for thirty-five years, and platforms win in every other area like operating systems, public cloud, security,” he observes. “Platforms win because customers want fewer vendors, they want less complexity, they want more automation, they want fewer necks to choke as one customer would tell me.”
Even as competing vendors such as Informatica and SAS have been static, he says, the cloud service providers have been trying to build their own mass-market cloud analytics platforms.
“The hyper-scalers have made acquisitions,” he says, using the sobriquet for giant cloud providers. “You know, Salesforce buys Tableau, Google buys Looker, Microsoft buys PowerBI — they’re going to take mid-market and below because they're the people that will put all their eggs in one basket.”
But the enterprises, says Anderson, the customers that are his focus, want diversity, and they want more to pick and choose. The new vision is for Alteryx to be a central tool for companies to move data and move analysis between different databases and cloud platforms.
"This big insurance company that was here, they've got a lot of stuff in AWS, a lot of stuff in [Microsoft] Azure,” he notes. “They even started putting stuff in Google Cloud.” However, “they can't put all their eggs in one basket,” he cautions. “We want to be that orchestration or automation layer across the enterprise for analytics” to help the largest firms jockey across clouds.
“We think there's permission that exists for us to build that platform as an independent company,” says Anderson.
What Anderson is describing as an orchestration layer is not unlike the “trans-cloud” opportunity that I have mentioned recently at software makers such as Sumo Logic and Nutanix. For a vendor of desktop analytics, it is a rather substantial step beyond an on-prem product built on a discrete set of data sources.
It sounds like quite a substantial engineering project, and M&A project, I point out.
“For sure,” replies Anderson. “You know, in two years we’ve done three acquisitions, and they haven’t been giant acquisitions, but they’ve given us a modern engineering team,” he observes.
That team is more capable, he says. “That’s really a global team — a lot more people in India, a lot more people in Eastern Europe — a modern engineering team that can crank out innovation at a much more accelerated pace.”
What’s more, Alteryx, he notes, has formed a venture capital investment unit to prospect among startups. “We want a front-row seat to the smartest people and the most innovative tech that's bubbling up today that might be of interest to us three, four, five years down the road,” says Anderson.
The platform he is describing, says Anderson, starts to bring more customers to the bleeding edge of analysis.
“You know, McLaren, for example,” the British car maker, a customer of Alteryx’s, “they use us to simulate three hundred million potential outcomes of a race before a race even starts, and every race, they move around one-point-five terabytes of data.
“They’ve got fifty analysts back in England, pounding away on keyboards, many of them using Alteryx,” he adds.
“I think the thing that amazes me, is, every company wants to be a McLaren, it doesn't really matter from building cars and watching them race around the track, or Stanley Black & Decker building tools — they all want to run as efficient a supply chain as possible."
The platform Anderson is describing — some of which already exists in the company’s product suite — goes beyond the traditional use case of Alteryx. It is no longer just to let employees assemble analysis but also to take over tasks in data preparation across a company. That includes things such as “extract, transform, and load,” or ETL.
Is this platform gambit biting off too much? I ask Anderson.
Not really, in his view. “Well, I mean, our engine as it exists today allows people to pull data from infinite sources and infinite formats, whether structured [data], unstructured, and transform it,” he points out. “That’s what the software at its core does.” His point is that this could, in a sense, be the description of ETL.
What about Microsoft and the rest? Will they try to crush Alteryx?
“We often get compared to PowerBI at mid-market accounts because Microsoft will give that away with enterprise license agreements,” says Anderson. “And, you know, I've competed against free for the better part of the last fifteen years, you know, free firewalls — free security if you buy our networking with Cisco [Systems].”
“I love having that conversation about how the headline is going to look when you accepted free security,” says Anderson with a chuckle. “And I think, you know, analytics is right up there as well in terms of sense of urgency but also importance for the business.”
All these product angles — the move to cloud, the ELAs, the engineering of a platform — are “just waypoints in the journey, and how quickly it leads to two billion and three billion” in annual revenue, says Anderson.
Alteryx is, again, cruising toward a billion dollars next year, he reminds me. “Based on how well we do with customers at that scale, the next year will dictate how quickly we get to double that or triple that,” says Anderson.
He has seen this movie, he reminds me. “Between Palo Alto Networks and before that, F5 — both of them grew fifteen to twenty-five-X in seven, eight years,” he recalls, alluding to revenue. “It's doable.”
What comes along with scale, he notes, is leverage.
“[CFO] Kevin [Rubin] loves the fact that, you know, we're looking at our business and looking to get better leverage out of it,” he says. “Because, you know, these days shareholders care a lot more about free cash flow margins, they care about operating margins and profitability.
"And of course, as we prepare for fiscal year ’23, we're taking all that into consideration into how we're planning for this next year.”
With the expenses to re-tool Alteryx, and to acquire technology, the company has been only modestly profitable of late. The forecast offered last month is for the company to end the year with a non-GAAP profit in a range of negative five million dollars to breakeven, and a non-GAAP net loss of thirty-two to thirty-seven cents a share.
Free cash flow this year is estimated at negative eighty million dollars.
But a view to profit is on the horizon, suggests Anderson.
As any company gets bigger, “As a percentage of your top line, you spend less, and that's how you start generating free cash flow and positive operating margins,” he points out.
“And, you know, in my experience, it happens right around the stage when we're close to a billion dollars.”
Alteryx will next report financial results in February, he says. “We'll give guidance for at least our first fiscal quarter, that’ll be a stake in the ground on year-on-year prediction of the first part of fiscal ’23,” he says. “And, hopefully, the path to free cash flow shows up there — that's the objective.”
As sincere and serious as all this re-tooling is, I’m reminded, as I shake hands and part with Anderson, of how he characterized things when first we spoke last year.
“Going from this stage to two or three billion dollars,” Anderson had assured me, “there’s tons of really fun experiences requiring tremendous focus on the right things.”
As I make for the elevator, a gong is heard. It’s Anderson, helping to bang the gong with a salesperson who’s being celebrated for having just landed a big deal. Anderson gives the team member an affectionate demi-hug, and the gaggle of team members who’ve clustered in the common area laugh with delight, clapping and whooping their applause.
At a recent $42.97, Alteryx shares are down twenty-nine percent this year.
Software scorecard: Unity soared, dLocal tanked in a tough season
Dec 06, 2022
The biggest money maker in software stocks this earnings season has been video game authoring tools company Unity Software, based on its seventy-seven percent rise in stock price since it reported on November 9th. The biggest dog so far is payments processor dLocal.
The earnings season isn’t quite over, but after seeing reports from a hundred and twenty-five names in the past two months or so, I thought it might be instructive to see how things have done.
The latest positive earnings surprises, Monday evening, are GitLab, makers of version control systems for programmers, and Sumo Logic, makers of DevSecOps platform software and tools, both of which saw their shares rise sharply after-hours after reporting better-than-expected revenue and earnings and a better-than-expected forecast.
What I’m after, however, is not just the after-hours “pops” of the stock price this earnings season, but also how the software names have done in the days and weeks following.
And so the table at the bottom of this post includes the stock jump right after earnings in one column, but also the cumulative return of the share price since the day of the report. The table is sorted by the latter metric, the cumulative return, from most positive down to most negative.
One fundamental thing to take away is the reversals. Companies that had a big pop on their report have in many cases notched big declines subsequently, such as legal software maker CS Disco. The opposite is true, too: some companies sold off big but recouped losses and even rose handsomely, such as contact-center software maker Five9.
I wouldn’t say there’s any deep reason for those reversals. They are merely a lesson not to take the pop from the report too seriously. After the smoke clears, some people find value in beaten-down names, while the enthusiasm for earnings reports for some names fades upon more careful reflection.
Another fundamental thing to take away is that companies that did everything right, such as Alteryx and Confluent, nevertheless sold off subsequently, whereas companies that blew it on reported results, such as Telos or Shift4 Payments, were able to bounce back in the days following.
As you can see from the average return in the footer row of the table, the next-day pop for these stocks hasn’t been too good, just one percent, on average. The average return from report date to today has been similarly modest, also just one percent.
It’s no surprise that software’s having a tough season. As I’ve chronicled in the past two months, many software names are reporting what they call “deal push-outs,” more time required to sell software, more scrutiny. It’s been harder and harder to make a sale as companies tighten their belts.
I’ve tried to weed out companies that are software to an extent but that include too much of a focus on content, such as online learning firm Coursera, which I dropped, or that are really making money by sales of items even if they regard themselves as a software company, such as luxury goods marketplace Farfetch. If you come across names you think do not belong in this group, please point them out.
I’ve included the most recent quarter’s revenue, as well, so that you can get a sense of each company’s scale. While there have been winners and losers of all sizes, I will point out that the average quarterly revenue of all the companies whose stocks have declined since their reports is $407 million. The average quarterly revenue of all the companies whose shares have stayed flat or risen since their reports is three times as large, $1.25 billion. So, on average, bigger companies have seen their shares hold up better than smaller companies.
The TL podcast for December 4th: Diving deep into Qualcomm, and the ominous outlook for AI press releases
Dec 05, 2022
An interesting time for earnings from younger companies, including CrowdStrike’s (CRWD) disappointing report, but Pure Storage (PSTG) was a bright spot; plus, diving deep into everything that’s going on with Qualcomm (QCOM), and beware next year’s flood of absolutely awful “generative” AI nonsense.
Qualcomm CFO Palkhiwala: We are becoming a different company
Dec 05, 2022
Some technology companies have gone through “transformations” where you had to really squint to see what, if anything, had changed.
When Microsoft and Oracle, for example, were early in their respective transitions to being cloud companies from being plain-old software vendors, you had to have a kind of monastic devotion to the scholarship of their quarterly filings to figure out what the heck was going on.
In contrast, chip giant Qualcomm has been going through a transformation for several years that is written in neon lights.
The company has expanded its sources of revenue dramatically, especially in chips for automobiles; and it has expanded its chip operating profit margin by a stunning seventeen percentage points in just two years.
At a gathering in September for analysts in New York City, the company’s CFO, Akash Palkhiwala, told analysts the company is well ahead of plan on signing up tens of billions of dollars worth of future revenue from car makers, lifting Qualcomm’s multi-year goals.
Qualcomm’s stock, however, doesn’t seem to reflect much if anything of that success. The shares, down about twenty-nine percent this year at a recent $125.66, track slightly below the benchmark Philadelphia Semiconductor Index, the SOX, in the past twelve months.
The valuation, moreover, as a multiple of enterprise value divided by projected next twelve months’ sales, at 3.7 times, hovers just slightly above where the stock was five years ago, when Qualcomm was a much less diverse company.
It is also toward the low end of valuations of the SOX companies, and well below the SOX average of 5.8 times.
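The multiple used above is simple arithmetic: enterprise value, which is market capitalization plus debt minus cash, divided by the next twelve months’ projected sales. A quick sketch, using round, illustrative figures that only approximate Qualcomm’s actual numbers at the time, not the exact consensus inputs:

```python
def ev_to_sales(market_cap: float, total_debt: float,
                cash: float, ntm_sales: float) -> float:
    """Enterprise value divided by next-twelve-months sales."""
    enterprise_value = market_cap + total_debt - cash
    return enterprise_value / ntm_sales

# Illustrative figures, in billions of dollars, approximating
# Qualcomm in late 2022 (assumed for the example, not reported):
multiple = ev_to_sales(market_cap=141.0, total_debt=15.5,
                       cash=6.4, ntm_sales=40.5)
print(round(multiple, 1))  # roughly 3.7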
Qualcomm’s stock is one of the inaugural picks of the TL20 group of stocks to consider. Given that I think its shares are a good buy, I was in the mood for a good chat about why the market might not be fully valuing Qualcomm.
And so, when Palkhiwala sat down to chat with me via Zoom this past week, one of my main questions for him was, At what point will investors get the message that the company has changed?
“Yeah, that’s obviously a very fair question, and top of mind for investors,” says Palkhiwala. Since the company’s analyst day meeting a year ago, he says, when the new approach at Qualcomm was formally laid out, “the amount of progress we’ve made within the year is very significant.”
In truth, the past year has been the culmination of a multi-year journey for Qualcomm to move away from its historical reliance on selling smartphone CPUs and modems for the vast majority of its revenue.
Under former CEO Steve Mollenkopf, the company set an agenda to expand into the realm of wireless chips it didn’t control, the “radio frequency” filters that navigate the airwaves. That product line is now over four billion dollars a year, bigger than the two “pure plays,” Qorvo and Skyworks Solutions.
A plan was nascent under Mollenkopf to spread the Qualcomm chip expertise more deeply into the automotive market, to equip cars with wireless and media processing; and into the Internet of Things, a grab-bag made up of all kinds of things including chips for Windows PCs and chips for Meta’s Oculus gaming gear.
Both of those markets received a greater emphasis as Cristiano Amon, formerly president, took the reins from Mollenkopf last year as CEO. For the first time, Qualcomm regularly broke out the revenue from both autos and IoT on a quarterly basis. Autos and IoT have grown from six billion dollars in revenue two years ago to over eight billion in the fiscal year that ended in September, thirty-eight percent growth.
More important, autos and IoT now make up twenty-two percent of chip sales that were once almost entirely smartphone-chip sales.
“When you step back and look at the long term strategy, it's playing out exactly as we had hoped for,” says Palkhiwala.
“Cristiano became CEO a year and a half ago, and we came up with this new approach on how we're going to look going forward as a company, effectively transitioning from a connected smartphone company to a processor company that powers the connected, intelligent edge.
“It becomes a different company with that, right?” he says. “It's about diversification. It's about expanding the technology portfolio that we have, and bringing it to auto and IoT. And, as you saw from our numbers and our guidance, we're very much on track in those areas.”
SKATING TO THE GOAL
The financial goals as articulated by Palkhiwala have been fairly straightforward.
As he laid out a year ago at the company’s analyst day in New York, over a three-year period, fiscal 2022 through 2024, the company would increase its revenue from chip sales, its division called Qualcomm CDMA Technologies, QCT, by “mid-teens” on a percentage basis, compounded annually. It would keep its patent licensing business, Qualcomm Technology Licensing, QTL, which makes up the other fifteen percent or so of revenue, at the same size and margin going forward.
And non-GAAP expenses would be kept in a tight range of twenty-one percent of revenue to twenty-three percent while operating profit margin for the chipset business would be maintained at thirty percent or better.
The company is already off to a good start. Its revenue outlook for its automobile business for the next several years is now higher than originally forecast. And operating expenses this fiscal year were just under twenty percent, while operating profit in QCT, known as earnings before taxation, “EBT,” was thirty-four percent for the year.
But the Street is not entirely buying it. Not only does the valuation multiple appear low, the Street consensus for revenue in the fiscal year ending in 2024, the end of that three-year range, is for $40.39 billion. That implies a compounded growth rate of fourteen percent, which is a little lower than would be suggested by “mid-teens.”
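The implied growth rate is a simple compound-growth calculation. A minimal sketch, assuming a hypothetical base of roughly $27 billion in chip revenue three years earlier (my round number for illustration, not a figure from the company; the $40.39 billion consensus is from the text):

```python
# Compound annual growth rate (CAGR) implied by a start value, an end
# value, and a number of years. The $27B base is a hypothetical round
# figure for illustration; $40.39B is the Street consensus cited above.
def cagr(start, end, years):
    return (end / start) ** (1 / years) - 1

implied = cagr(27.0, 40.39, 3)
print(f"implied CAGR: {implied:.1%}")  # roughly fourteen percent, a touch shy of "mid-teens"
```

The same helper works for any of the growth figures in the piece, which is why analysts can reverse-engineer what a consensus estimate implies about the company hitting its targets.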
Hence, as I suggest to Palkhiwala, investors are still grappling with something in the story.
It is not that Palkhiwala isn’t an excellent messenger for the Qualcomm story, because he most certainly brings unique talent and experience essential to the task. Since he took the role of CFO in August of 2019, he has cultivated an excellent reputation on the Street as a reliable forecaster of Qualcomm’s financials, and a straight-shooter.
He knows the company inside and out. The rhetoric about transformation is backed up by his own career reflections, having started at Qualcomm twenty-one years ago.
“The talent in the company, I think, has expanded a lot” in that time, he tells me. “We used to be very much focused on one set of technologies, and now, as we've grown up as a company, and we’re going into different markets, it really is in some ways a very different company than we used to be.”
The continuity, he says, is in the fact that “we’ve, kind-of, retained the soul of the company and the technology that we created,” that portfolio of chips now being moved into automotive and IoT uses.
An affable manner probably helps ingratiate Palkhiwala with the Street. His self-effacing humor can be disarming. When we met for the first time in September, at the auto event, and I asked him for an interview for The Technology Letter, I told Palkhiwala I’d interviewed Amon many times in past, and Mollenkopf before him. “So, now you’re ready for the consolation prize,” quipped Palkhiwala.
No runner-up, actually: Palkhiwala, an engineer by training, often frames his financial talks by tying them back smoothly to the technology vision laid out by Amon.
ON THE EDGE OF SOMETHING BIG
Investors, says Palkhiwala, understand the company’s position as a chip supplier to the “connected edge.”
In a nutshell, as more and more of computing has been sucked into centralized data centers of cloud computing services such as Amazon’s AWS, a second wave is happening where those servers have to connect to devices of all sorts at the far corners of public and private networks.
The devices might be smartphones, yes, but increasingly they are other kinds of things: point-of-sale smart terminals hooked up to networks; your Internet-equipped car; heavy equipment with wireless capabilities to send status information back to monitoring dashboards.
“We have the ability to be the other side of the cloud, right?” says Palkhiwala. “So, we would be the device on the edge that is connecting to the cloud and we're doing an activity of processing artificial intelligence at the edge.
“That message and that understanding of the opportunity in front of us resonates with investors.”
The technology trends that matter, he argues, are propelled by economic tribulation.
“We see this digital transformation driving this tremendous change in these end-markets that is very much in our favor, and that brings those end-markets closer to the technology that we already have,” he says.
“Those trends,” he says, “are, if anything, accelerating as we hit some of these economic challenges,” meaning, the use of automation or other resource-saving technology measures.
"Because if you are a retail company, if you are a manufacturing company, the use of technology becomes even more important,” and “digital transformation becomes a tailwind in a challenging macroeconomic environment, and that's something that we expect to benefit from as we look out the next several years.”
Included in that opportunity is the prospect of re-using decades of Qualcomm’s amassed intellectual property in new markets. “The key assets that we have as a company is maybe the broadest technology portfolio of any semiconductor company out there,” says Palkhiwala.
“We created this portfolio for our handsets, now we can leverage it into automotive and IoT,” the newer areas, “and there'll be more opportunities in the future because it's a portfolio of technologies that is extremely relevant to every end-market in the semiconductor industry.”
With investors, Palkhiwala has emphasized that the leveraging of that portfolio has produced financial leverage, too, most immediately in the surge in QCT’s profit margin from seventeen percent of revenue two years ago to thirty-four percent most recently.
“Since we are reusing the technology we created for mobile, it allows us to grow revenue in a very profitable manner,” he says. “That's where the operating leverage of the business comes into play: if you look at our actuals for the last couple of years, as we've grown revenues, we've been able to expand our operating margins — and that's a framework that I think sits well with our investor base.”
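The leverage he describes is easy to see with round numbers. A toy model, with hypothetical figures chosen only to illustrate the mechanism, not Qualcomm's actual cost structure: when a fixed R&D base is reused across markets, costs grow more slowly than revenue, and margin expands.

```python
# Toy operating-leverage model. All numbers here are hypothetical, chosen
# only to illustrate how reusing a fixed R&D base expands margins as
# revenue grows; they are not Qualcomm's actual cost structure.
def operating_margin(revenue, fixed_cost, variable_rate):
    """Margin when costs are a fixed base plus a variable share of revenue."""
    cost = fixed_cost + variable_rate * revenue
    return (revenue - cost) / revenue

low = operating_margin(25.0, 10.0, 0.43)   # smaller revenue year: 17% margin
high = operating_margin(38.0, 10.0, 0.43)  # after growth, same cost structure: ~31%
```

The point of the sketch is only that the margin expansion Palkhiwala cites does not require cutting costs; it falls out of revenue growth against a largely fixed technology base.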
If the big picture is promising, and the financial goals are clear, and the messenger is an exceptionally talented one, what, pray tell, is holding back the Street’s fully valuing this stock?
SHORT-TERM ANXIETIES
There are at least a few things at the moment that are affecting investors’ ability to get their arms around “the New Qualcomm,” if you will. Our wide-ranging talk covered pretty much all of them.
Most immediate is that, as always, Qualcomm’s business still is very much impacted by what happens in smartphones, and smartphones have been a dog this year.
That has led to last month’s downward revision in Qualcomm’s estimate of the market, and a disappointing revenue outlook for this quarter.
“In the shorter term, the semiconductor industry is going through some significant challenges,” Palkhiwala observes when I ask what is top of mind in the company’s fourth-quarter earnings report last month. “One, the inventory build that has happened in the channel,” he says, meaning, chips that now have to be first used up by phone makers to clear their stocks as sales come in lower than expected. “And, second, is just, kind-of, the market weakness we're seeing because of macro-economic conditions.”
The macro-economic picture for Qualcomm is construed by investors very specifically: Since China became a huge market for Qualcomm smartphone chips some years ago, China has featured heavily in results as boon or bane.
It is certainly a bane at the moment. Handset sales among China vendors using Qualcomm chips, companies such as Xiaomi, Oppo, Vivo, are down by over twenty percent this year in unit terms, according to some market data.
Palkhiwala is inclined to play down China’s uniqueness, however. “Clearly, the restrictions that are in place has an impact on total handset sales,” he says, referring to the country’s strict COVID-19 measures. “But there are cyclical things that are impacting the entire industry,” he adds.
“You're definitely seeing a post-COVID shift of spend from goods to services for the consumer” globally, says Palkhiwala. “Rather than buy a phone, someone is going on a vacation, sitting in a hotel, having a meeting in a restaurant.”
“That is obviously a temporary thing that'll happen.” When will China return to something more like normal? “I’ll stay away from speculating on it, there are so many factors that play into it that are beyond my understanding,” says Palkhiwala. “I will say, when that happens, we’ll be ready to take advantage of it.”
If he won’t predict the China market rebound, investors would love for Palkhiwala to discuss how the rest of the handset market may offset China’s slump. While China handset sales are down by twenty percent or more, Palkhiwala’s comment to the Street last month was that the total global smartphone market is declining by “low double digits,” implying there are bright spots elsewhere.
“Generally, the U.S. has obviously continued to be very strong,” says Palkhiwala in response to the question of what markets offset China. “Emerging markets are going to have their own trend of transitioning from 4G to 5G” wireless networks, the latest speed bump in connectivity, he adds.
“India is a great example where they're at the front end of a rapid shift from 4G to 5G, and so that’s going to be a positive development for us in the next year or so.”
He is inclined, though, to view smartphones “on a macro, global basis, rather than each individual market.”
That global handset market is not always well understood by investors. It may be more robust than they think, suggests Palkhiwala.
Many investors believe it is a “cash cow” business, very little growth but healthy profit margin.
“From a market perspective, it’s definitely an accurate portrayal,” says Palkhiwala. However, even in a maturing market there are trends that are favorable, he says. One is the shift to more premium phones needing more brawny chips at a higher price. “A great example is emerging-market TV,” he says. “There used to be one TV where fifteen people were watching, and now you have fifteen phones, each person watching the content of their choice.”
As a result, “The next time that person buys a phone, they're going to buy a phone that is more capable on audio and video experiences and movie experiences and which makes for a more expensive chip and revenue growth for us.”
Consequently, there may be more growth than the cash-cow theory implies. “We’ve said that consistently,” says Palkhiwala, “that these are factors that help us in a mature market to grow revenue.
“The proof is in the pudding,” says Palkhiwala. “If you look back over the last couple of years, there has been a significant increase in our weighted average selling price [for phones] as people would calculate it, outside of the share gains that we had.”
Temporary though the handset slump may be, the forecast offered last month spooked the Street, which has been used to the steady hand of Palkhiwala’s forecasting. The revenue outlook was twenty percent below consensus at the time, sending the shares down sharply.
I ask if the current market decline has forced Palkhiwala to change how he talks to the Street about the outlook. “You know, the philosophical approach to guidance is something that you keep forever, you don't change based on the environment, and that approach is really focused on transparency and reflecting the best-available information,” he says.
“To be honest, in talking to investors, that's what they would want from us anyways,” he says, “because there are things that we know, and we want to reflect that, there are things that we don't know on the macroeconomic environment, and we want to be transparent about that as well.”
A practical element of Qualcomm’s strategy is that it hinges on changes in markets such as autos being inevitable, and Qualcomm starting from a relatively small base of sales. “The fortunate part for us in the automotive business is we are a very small player, relatively speaking, on our way to becoming a very large player,” says Palkhiwala. The company “could achieve our forecast” for automotive revenue “without any growth in the total number of cars” sold, he says.
COMING TO GRIPS WITH EXPENSE
Even if there’s more growth down the road for QCT than the cash-cow theory suggests, what happens right now to the company’s ability to spend as QCT sales temporarily come under pressure?
Delivering last month’s gloomy quarterly forecast, Palkhiwala told the Street that “we will be decisive in managing operating expenses, especially if the downturn gets steeper.” That already includes a hiring freeze at Qualcomm.
How much more rigorous will the company need to be? Does a freeze turn into layoffs?
For the time being, more of a shifting of emphasis in the workforce, he says.
“In order to get our mix of businesses right and redirect investment, we have to make some changes in the skill sets we have,” says Palkhiwala. “So, it requires some amount of cuts.
“But overall, as a company, when you think about it, we gave very specific guidance on OpEx [operating expenses] for the year as well, or qualitative guidance on OpEx for the year, so we're planning to execute to that and, then, as we said, if we need to do more, we're willing to do it.”
More broadly, operating expenses are, of course, a balancing act.
“The way we think about it is, we want to invest in the growth areas for us, which is automotive and IoT,” he says. “What we're looking to do is reduce our investments in mature businesses and redirect those dollars to those areas.
“Second, is, we're still in this time frame of macroeconomic uncertainty and COVID restrictions, and so while we have that overlay in the economy, we want to be managing our expenses more carefully than we would normally, just because we don't know what we don't know.”
THE PROFILE OF PATENT LICENSING
If some investors believe, however incorrectly, that chip sales are now becoming a cash-cow, QTL, the licensing business, is the original cash cow, a royalty business with amazing profit margin of seventy percent or better. The multi-year plan, again, that Palkhiwala has articulated is for QTL’s rich margins to remain pretty stable.
However, there can be an occasional dip in profitability, as was the case last quarter. Such a blip can send investors scrambling to understand if they are seeing the start of something ominous. It is a passing phenomenon, says Palkhiwala of the slight margin decline of a couple percentage points.
“As the size of the market goes down it puts pressure on margins, there’s nothing beyond that,” he says, referring to the current handset slump. Volume of sales by licensees, the phone makers, directly affects the royalty stream.
On the plus side, the move to 5G wireless is going to show up in new patent opportunities for QTL. For example, in automobiles, connections from the car to the Internet are going to move over time to 5G from the current 4G modems Qualcomm has been selling, with perhaps five dollars per vehicle in new licensing payments.
When I ask about that prospect, Palkhiwala, careful to keep within his lane, steers clear of things that lie too far beyond the three-year forecast he has offered.
“It is definitely something that's going to boost the profile of QTL,” says Palkhiwala of 5G. “The timing of that is further out in the future, right? It doesn't help the shorter term, but it does create an opportunity for us as we look forward.”
THE CHINA ISSUE
Another current-market concern, one that is now a multi-year journey, is the U.S. export restrictions on sales of chips by U.S. firms to China.
The U.S. Department of Commerce has now ratcheted up those restrictions during not one but two presidential administrations. The anxiety sometimes seems at fever pitch in the media and perhaps among investors.
Palkhiwala’s general tone, when I ask about the matter, is to present Qualcomm as somewhat at a remove from the most damaging restrictions. “We're in the consumer business,” he points out. "Some of the restrictions that have come up recently that were implemented do not impact us because we're focused on consumer devices, portable devices versus data centers.”
His point is that the U.S. seems acutely focused on those kinds of chip technology that can power existential technologies such as forms of AI developed on the biggest computers inside research facilities. Those data centers may connect to the edge Qualcomm is pursuing, but the edge could be viewed as technically a separate market at arm’s length.
Just as important, the Qualcomm edge market is a worldwide phenomenon, he adds. The implication is that there are plenty of other markets. “You go back to our original strategy, which is providing chipsets for the connected intelligent edge, which is really being fueled by digital transformation and cloud connectivity, and that's a global phenomena that will continue to remain in place.”
DO INVESTORS UNDERSTAND THE TECHNOLOGY?
The short-term stuff, things such as China’s handset sales or a couple of points of margin, is, I’m convinced, the kind of hand-wringing that comes and goes. It seems to matter most to a kind of outer circle of investors who haven’t done much research on Qualcomm or its business model.
But one of the elements that I’m convinced is more significant is investors’ confusion about just how the new markets of automotive and IoT are playing out. Although the vision is in neon lights, as I said, there may still be elements of the details that need explaining.
“I'd say within IoT, we have some more work to do in providing more evidence of being on track and succeeding,” says Palkhiwala.
That’s good, frank, self-reflection. IoT seems perhaps the hardest part of the picture for investors to put together, and the hardest to articulate financially — not surprising, given it is not one technology, product category, or market. IoT is a grab-bag of stuff. The term is what AI scientist Marvin Minsky called a “suitcase word,” a word into which you throw just whatever you want. It was invented by industry to have a convenient way to talk about so many newfangled technologies relating to the Internet, and that can make it hard to get one’s head around it.
IoT includes lots of infrastructure that governments and municipalities have been deploying, things such as lamp-posts with wireless connections for remote control, and the ubiquitous camera systems on street corners. Amon has been talking about those opportunities for Qualcomm for years, long before he took over.
But more recently, more high-profile opportunities have included shipping millions of Qualcomm chips inside the Oculus 2 headset from Meta for virtual reality, and chips that compete with Intel processors in Microsoft Windows-based laptops.
In these newer markets, Qualcomm faces the same challenge that great chip companies always face, which is that they are at one or two levels removed from the end user.
In smartphones, over decades, Qualcomm managed to curry a little bit of brand recognition for its Snapdragon chips that powered the handset. Still, consumers cared mostly whether it was an Apple or a Samsung device.
In the case of VR, it is a sure bet that most users of Meta’s headset have no idea there’s a Qualcomm chip in there. Ditto for Microsoft’s Surface Windows tablet family. The recently introduced Surface Pro 9 has the option of using Qualcomm’s processor and modem in place of the dominant Intel “Core” family of x86 processors.
The problems of being at a remove can be manifold. When I watched Amon a year ago onstage in New York talking about Mark Zuckerberg’s Metaverse, I couldn’t help but shake my head at the incongruity. The Qualcomm chips were doing good work to power a revamped headset for video gaming, and if video gaming was the sales pitch, everything would have been just fine.
But, as a partner, Amon and team are conscripted into the sales pitch for a bold new world of VR and augmented reality, and everything else, under the rubric “XR,” which really doesn’t exist. Qualcomm, in addition to supplying great chips, is carrying water for a market, the Metaverse, that is, essentially, vaporware at this point, and bound to be so for a long time.
The prospect of the Surface is more straightforward. The PC market wants competition. And Apple, which dumped Intel chips for its own parts, has provided a fine example for Qualcomm and its partners that there is a realistic chance of unseating Intel. The PC industry is a “great example” of Qualcomm re-deploying its mobile phone technology, Palkhiwala tells me.
“Apple made the transition from x86 to ARM-based architecture, really bringing their iPhone technology to bear in Macs,” he explains. “The same advantage exists for us where we can take the technology portfolio we created for phones and go disrupt a very large silicon market by extending that technology portfolio.”
Except that, again, there are some drawbacks to being at a remove from the end user. In this case, there are still growing pains for the Qualcomm PC effort.
One reviewer of the Surface Pro 9, The Verge’s Monica Chin, noted last month the Qualcomm version of the machine has “exceptional hardware,” and she lauded the inclusion in the Surface of built-in 5G connections — something the Intel-based model doesn’t have.
However, she also noted that aside from great video conferencing, “most other tasks you might need to do in a workday were frustratingly slow,” including very basic things such as chat and watching videos. Those apps weren’t written natively for the Qualcomm ARM-based processor, and so their performance was degraded, she notes.
“Windows on Arm is not ubiquitously unusable — but it is ubiquitously limited,” wrote Chin. Ouch. That’s the problem being a partner and not in control of the total product. In the case of Apple’s chips, when it dumped Intel, it made sure the Mac’s software and hardware worked together smoothly. Qualcomm can’t control the progress of Microsoft’s code nor how well it uses the Qualcomm silicon.
AUTOMOTIVE’S HEAD-SPINNING PROGRESS
If IoT needs a little help, the automotive market seems crystal clear, especially after the well-received September auto event in New York.
The greatest immediate opportunity at “the connected edge” for Qualcomm is the connected car. Amon and team have repeatedly emphasized that it is a once-in-a-lifetime opportunity for chip companies because the “content” in cars, the number and variety of chips, is surging dramatically as the things the car is supposed to do increase.
A decade or so ago, all the chips were for was anti-lock braking mechanisms and the fuel control system. Today, chips in cars are the eyes and ears of road sensing via technologies such as LiDAR; the real-time wireless connection to the internet for the cockpit; the ever-more-feature-rich dash and central stack; the seat-back entertainment screens; the network of cabin sensors; and on and on.
Qualcomm is already well established selling modems to car makers for the network connection, and it is currently expanding into that digital cockpit that is sprouting more and more devices with connectivity chips and processors and display controllers.
The next step is ADAS, advanced driver assistance systems, everything from automatically nudging a car back into its lane, to, someday, a car that drives itself. The financial payoff of ADAS will start to become material starting in Qualcomm’s fiscal year 2026, Palkhiwala told the Street at the September event.
Oddly enough, the success in automotive is coming so fast, it may be leaving investors a little confused.
In particular, the September event had a big headline from Palkhiwala, the jaw-dropping revelation that Qualcomm has secured from car makers thirty billion dollars worth of “design wins” for future chip sales, meaning, statements of intention, if not formal contracts, for car makers to use the company’s chips.
That thirty-billion-dollar figure had been just thirteen billion a year ago, at the analyst day event. What’s more amazing is that at the time of the August earnings report, Palkhiwala had said it was only nineteen billion.
Thus, the wins with car makers soared by eleven billion dollars “in just two short months,” Palkhiwala boasted at the September event.
Watching the room, it seemed to me Street analysts were fairly taken aback at the magnitude by which Qualcomm’s good fortune had surged. How do you go from nineteen billion dollars in commitments from customers to thirty billion in two months? I asked Palkhiwala.
“These are deals we were negotiating for a long period of time,” he notes, and they just happened to close. “Sometimes, the timing just works out in your favor, and this was an example of that.” Had those deals happened earlier, August’s figure would have been higher than nineteen billion, he notes.
That particular time frame is less important, says Palkhiwala, than what the steady increase — first thirteen billion, then nineteen, now thirty — says about the company’s steady march to win over the auto world.
“What we're trying to convey is really show progress towards the revenue targets we set.”
Design wins are a pipeline of future business. They represent the company’s estimate of the future revenue from a particular chip in a particular model of car — a “socket,” as it’s known — which takes years to come to market.
“If the design value over the life [of the socket] is one billion dollars, and you start shipping in about three years” from the time the car maker agrees to buy the part, “you then ship it for another four or five years after that,” for an entire lifespan of seven or eight years.
“That billion dollars plays out over that period of time, and it informs our revenue targets.” In particular, Palkhiwala told analysts that the company now has ninety percent of expected revenue for the next four years essentially secured, or “covered,” as he put it, by that thirty billion of design wins.
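The socket economics he sketches can be written down as a toy schedule. The three-year lead time comes from his description; the even spread over a five-year shipping window is purely my illustrative assumption, since the actual ramp of any given socket would vary.

```python
# Toy model of how one automotive design win converts to revenue over time.
# The three-year lead comes from the lifespan Palkhiwala describes; the even
# five-year spread is an illustrative assumption, not a company disclosure.
def socket_revenue_schedule(design_value, lead_years=3, ship_years=5):
    """Revenue per year, indexed from the year the design win is booked."""
    per_year = design_value / ship_years
    return [0.0] * lead_years + [per_year] * ship_years

schedule = socket_revenue_schedule(1_000_000_000)  # a $1B design win
# Nothing ships in years 0 through 2; $200M ships in each of years 3 through 7.
```

Summing schedules like this across the whole thirty-billion-dollar pipeline is, in rough terms, how a pile of design wins becomes “coverage” of a multi-year revenue target.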
That allowed him to raise the company’s revenue outlook for automotive. A year ago, he had told the Street to expect three and a half billion dollars in revenue from automotive in the company’s fiscal 2026, and now that was looking more like four billion. And the target offered a year ago for 2031, eight billion in automotive revenue, is now nine billion.
Interestingly, the ninety percent of revenue now covered by that pipeline is higher than the coverage had been at the analyst event a year ago, when it was just seventy percent of future revenue. Because I suspect the pipeline, and the coverage, can be mysterious, I ask Palkhiwala if there’s a goal to secure design wins to cover a certain amount of future revenue.
“It’s the other way around,” he says, “where you have the pipeline of design wins and it informs your revenue targets.” That strikes me as significant: deals are happening as they are happening, which seems more honest than concocting a stream of future business to produce a certain revenue forecast on a PowerPoint slide.
I point out that during the September event, analysts wanted to nit-pick Palkhiwala’s projections for the total market value of automotive, the “total addressable market” or TAM. Come 2030, he told the audience, the value of things such as the car connectivity chips, and the digital cockpit, and ADAS — that ever-expanding constellation of content — will total one hundred billion dollars in industry chip sales annually.
Some seemed to quibble with that.
“To be honest, this is like one of those things where everyone can come up with a different estimate,” he says. “The fortunate part for us, in the automotive business, is we are a very small player, relatively speaking, on our way to becoming a very large player.”
Although Palkhiwala estimates auto sales in order to calculate the design-win pipeline, the design wins, and the revenue, are happening as an effect of that expanding use of chips, not as a result of automobile sales growth.
“We could achieve our forecast” for automotive revenue “without any growth in the total number of cars” sold, he says. “The addressable market for silicon [in the car] is expanding, and then we’re picking up share within that market.”
“And so, that gives us the confidence that it's not really betting on the scale of the market.”
Amon, center, with Palkhiwala, left, and Nakul Duggal, general manager of the automotive division, during September’s automotive presentation to analysts. Palkhiwala, an engineer by training, and a twenty-one-year Qualcomm employee, often frames his financial talks by tying them back smoothly to the technology vision laid out by Amon.
IS ADAS SAFE AT ANY SPEED?
Aside from market size and timing, I have my own hesitancy about the automotive market, and it has nothing to do with Qualcomm’s acumen as an obviously highly skilled maker of chips.
No, my concern is the viability, for all parties, of ADAS, the part that Palkhiwala says becomes material for Qualcomm in 2026. Palkhiwala says almost sixty billion dollars of the hundred-billion-dollar market opportunity for Qualcomm, come 2030, will be in ADAS, and the company intends to win there.
Qualcomm may be just as talented at building ADAS as competitors Nvidia and Mobileye, the recently spun-out unit of Intel.
The problem is, I’m not sure key parts of ADAS will ever work.
The sense that I get from talking with people in the fields of AI is that a lot of self-driving hype is never going to lead to real product. In October, for ZDNet, Yann LeCun, one of the deans of today’s AI, told me in very blunt terms that he thinks the entire field of self-driving cars is on the wrong path.
“People working on autonomous driving have been a little too optimistic over the last few years,” LeCun told me. “The first thing you do is you build a demo where the car drives itself for a few minutes without hurting anyone.”
But then, said LeCun, “you realize you are never going to get there because there is all kinds of corner cases, and you need to have a car that will cause a fatal accident less than every two hundred million kilometers, right?”
LeCun’s conclusion is “You’re going to have to engineer the hell out of it,” by which he means, “if you have a large enough team of engineers, you might pull it off,” meaning, self-driving, “but it will take a long time, and in the end, it will still be a little brittle.”
That’s rather damning coming from a certified AI genius. Closer to home, even fans of Mobileye are concerned.
Writing to clients this past week, New Street Research analyst Pierre Ferragu, while endorsing Mobileye, noted that the technology of self-driving, industry-wide, “displays impressive reliability in the five nines” but “deployment remains challenging.”
Translation: it’s really hard to do stuff on the open road and avoid killing people more often than one every two hundred million kilometers, as LeCun suggests.
And so, I ask Palkhiwala, should I be concerned that Qualcomm, as talented as it is, is going down a path that is bound to be one of disappointment, pursuing a dream, self-driving, that computer science still has not figured out?
“I think it's actually a great question,” says Palkhiwala, “and it’s something that's a strength for us, let me tell you why.”
ADAS, he says, “is not a yes or no answer.” A lot of the press, says Palkhiwala, has been focused on the most extreme areas of self-driving, but that’s not necessarily where the market is at.
The U.S.’s National Highway Traffic Safety Administration, which oversees tests of ADAS on top of its tireless work investigating why car crashes happen, has described six levels of ADAS. The NHTSA’s framework is everyone’s touchstone for talking about ADAS.
At the low levels, levels zero, one and two, the car is always driven by a person, and vehicle technology only intervenes in things such as an emergency situation, or to help periodically with braking or steering.
At Level Three, the guidelines specify, a car’s auto-pilot may take over while a person is in the driver’s seat, ready to jump back in. The extreme things are at Levels Four and Five, where there may be no person ever in the driver’s seat, just a machine all the time.
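Those levels are easy to keep straight as a simple lookup. The wording below is my paraphrase of the SAE definitions NHTSA uses, not the agency’s own text:

```python
# Driving-automation levels (SAE J3016, as adopted by NHTSA), paraphrased.
ADAS_LEVELS = {
    0: "No automation: warnings at most; the human does all the driving",
    1: "Driver assistance: help with steering OR braking/acceleration",
    2: "Partial automation: steering AND braking help; human stays engaged",
    3: "Conditional automation: car drives itself at times; human on standby",
    4: "High automation: no driver needed within a defined operating domain",
    5: "Full automation: no driver needed anywhere, under any conditions",
}

# Palkhiwala's stated focus versus the "extreme" end of the market.
qualcomm_focus = [2, 3]
for level in qualcomm_focus:
    print(f"Level {level}: {ADAS_LEVELS[level]}")
```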
“People have this view that eventually we're going to have self-driving cars and the extreme vision of the industry,” says Palkhiwala. That is the yes or no answer, the binary, he alludes to. “And I think a lot of that vision, well, people are revisiting the timeline” in which it may happen.
Qualcomm, he says, is “very focused on, and most of our design pipeline is centered around, Level Two and Level Three,” those simpler functions of braking assistance or partially automated driving with a human driver present. Those functions “are actually a very achievable set of experiences,” he says.
“We are not a Level Four or Five company,” he continues, “and all of our competitors are actually just focused on those areas,” including Nvidia and Mobileye.
If Qualcomm is “a very small player” in automotive chips “on our way to becoming a very large player,” then it is, he says, because “we’re focusing on the sweet spot of the U.S. market versus those [more-ambitious] parts.”
Again, the proof is in the pudding.
“You’re seeing the difference in design pipeline show up,” he says. Some of that thirty billion dollars is business that will be practical years from now, when it is on the road, such as Levels Two and Three ADAS.
As we round out our discussion of things that might be clouding investors’ view, there is also room to discuss what might be a secret weapon of the firm, something that I’m inclined to liken to the third dimension relative to investors’ Flatland horizon: M&A.
Not only is Palkhiwala an asset in thinking about the technological scope of things, he is also an astute custodian of one of Qualcomm’s favored means of growth.
Palkhiwala started at Qualcomm in 2001 in the M&A department, after a few years in private equity in Cleveland. “I think my plan was to get as far from Cleveland as possible, and San Diego qualified,” he recalls.
“We've had a very successful M&A program,” Palkhiwala observes.
The company has a long history of buying important assets that it then builds into meaningful franchises. That was the case with the three-billion-dollar purchase of radio frequency assets from TDK in 2019 that cemented Qualcomm’s position ahead of Qorvo and Skyworks in RF. Big deals have tended to be few and far between, such as the abortive bid to buy auto-chip maker NXP Semiconductors for fifty billion dollars in 2018.
The TDK assets are a good example of how Qualcomm does successful M&A, Palkhiwala says. “They were number four, number five in the industry, we made them number one,” he says, meaning, RF share that had trailed Qorvo and Skyworks. “So, we had an organic plan to expand” into RF “but we were able to do it faster because we made an acquisition.”
The PC market that is so tricky given the Surface Pro 9’s limitations has the intriguing prospect of getting a lift from the $1.4 billion acquisition last year of a conspicuous Silicon Valley startup, Nuvia. The company is staffed with stellar former Apple engineers, including Gerard Williams, who led Apple’s move from Intel to its own Mac chips.
The company’s work prior to the deal, a chip code-named Phoenix, has an ability to operate at very low power levels that “are well-suited to mobile devices,” wrote longtime chip observer Linley Gwennap of the Microprocessor Report last year.
The Phoenix chip could boost Qualcomm’s high-end smartphone processors, but also, “could boost Qualcomm’s efforts in laptop PCs, enabling the company to create a chip similar to Apple’s M1 but for Windows systems,” wrote Gwennap.
Says Palkhiwala, Nuvia “is going to allow us to accelerate our entry into the PC space, among other things.”
The other prominent recent deal is Arriver AB, a software maker for those ADAS functions in the car. Arriver is an intriguing example, like the TDK deal, of Qualcomm stepping in at the right moment in time to pick off choice assets. The company is a carve-out from Veoneer, a Swedish firm that up until May of 2021 had traded on the New York Stock Exchange. Veoneer had been the electronics unit of Stockholm-based auto parts supplier Autoliv and was spun out as a public entity by that company in 2018.
Qualcomm teamed with New York investment firm SSW Partners, which took Veoneer private for four and a half billion dollars last year. Qualcomm then turned around and bought the Arriver software unit from SSW, leaving the rest with SSW.
When you think about that multi-year chain of deals — Autoliv to Veoneer to Arriver — the transaction is a window into Qualcomm’s ability to zero in on just what it needs to complement what is already in-house. Rather than take whole companies, Amon and team artfully pick and choose the parts they want.
“We have a very strong chipset roadmap for ADAS,” says Palkhiwala, “and Arriver brought software assets that really are driving a large portion of our design win pipeline in automotive.”
Since taking over Arriver roughly six months ago, Qualcomm has been busy developing a software “stack,” meaning, several levels of capability that are optimized to run on the Qualcomm chips. The focus is “computer vision” applications of machine learning that will advance the ADAS functions. That work has proceeded in close collaboration with Qualcomm’s auto customers such as BMW.
In other words, with Arriver in autos, Qualcomm has a chance to meld hardware and software closely, unlike in the PC market, where it has been one step removed from Windows users.
Those two acquisitions are proof, says Palkhiwala, that the strategy to enhance Qualcomm’s own abilities works well. “In the context of that strategy, there are certain — given where the asset values are today — opportunities for us to continue to do that,” he says.
M&A will focus “almost exclusively” outside of the handset area going forward, he says. “It’s more about things that fit into our diversification plan” in autos and IoT.
SO, WILL INVESTORS GET IT?
To return to my original question: at what point will investors get it? How many audiences with Palkhiwala does it take for them to understand the value of the New Qualcomm? You might think the third time is the charm, and investors have now had more than a few encounters.
“To be honest, the way I think about it, is, it’s not my job to speculate on that — I’d be a lot richer if I were doing that well,” he says.
"We've given a set of commitments to investors, and our focus is on executing and exceeding those targets.”
Among all the potential areas of clarity and confusion in the Qualcomm story, there are things the company can control, and those it cannot.
“While there are short term cyclical challenges that the entire industry is facing, the long term secular trends are still very much in our view,” Palkhiwala reminds me.
“We're on this journey to prove our ability to diversify away,” he says, recalling the initiative that began with Mollenkopf but that has become clearer under Amon. “And I think we've made a lot of progress in a year” since the November analyst day meeting in New York.
Of course, an explicit stump speech for a stock is something that executives from time to time can offer. And so, I ask, at a recent price around $125, and with its below-average forward sales multiple, is the stock a good buy?
"I have a lot of confidence in the technology of the company we have, with the opportunity in front of us because of digital transformation and cloud connectivity,” says Palkhiwala.
"And so, I’m bullish on what's in front of us, and that's how I'm focusing on my work.”
Nutanix CEO: Given all that’s going on around us, we’re happy with how we’ve done
Dec 01, 2022
Among this evening’s positive earnings results, one deserving mention is Nutanix, whose CEO, Rajiv Ramaswami, was kind enough to talk with me following the report via Zoom, as he has in the past.
Tonight’s results are consistent with what he and I have discussed in those past conversations, namely that the company is able to keep delivering better-than-expected revenue growth despite the broad-based weakness we keep seeing in the software world.
“Our quarter performed even better than our guidance on all the metrics; we kept our revenue guidance for the rest of the year; [and] we raised our operating income and free cash flow guidance for the rest of the year,” is how Ramaswami sums it up.
“Given everything else going on around us, we are very happy with how we’ve done.”
The stock rose initially in late trading, and then slumped a bit. Nutanix has been a great performer this year, down just eleven percent.
The surprise this evening in the company’s fiscal first quarter report was a positive non-GAAP operating profit for the first time, a big upside given that Ramaswami had told the Street back in August to expect a negative operating margin of about six percent.
Revenue beat expectations, and on top of that, Ramaswami has been keeping a rein on expenses, which is leading to that positive surprise on the bottom line.
“It’s continued discipline in terms of how we manage our expenses, and we will keep that going forward,” Ramaswami tells me.
Indeed, the outlook for this quarter is for that operating profit margin to expand to five percent to ten percent. For the full year, Ramaswami expects the margin to be positive two percent to four percent.
“We know what we can control” in terms of operating expenses, Ramaswami tells me, “and we have a history of controlling that in the time that I’ve been here, so we’re very confident about the ability to be able to manage that.”
The revenue outlook for this quarter is slightly higher than consensus, $460 million to $470 million versus the average $458 million, but the company kept its outlook for the full year the same, which is just in line with consensus.
The reason Ramaswami is not increasing the outlook is because of the unknown pace of signing new customers. He’s said for the past couple quarters that signing new “logos,” as it’s known, has an element of uncertainty given the macroeconomic situation.
Getting new customers is the most expensive, the hardest, and the most uncertain part of the business in a shaky economic climate.
Despite keeping a tight rein on expenses, Ramaswami insists the company is not short-changing its R&D, a point he returned to later in our conversation.
Results last quarter were consistent with that cautious view, but the company still added customers, with the total customer count rising by twelve percent, to 23,130, the same rate of growth as the prior quarter.
That’s actually a little better than is typical for Nutanix at this point in the fiscal year, says Ramaswami. Overall, he says, “We’re fine with where new customers are tracking from a global perspective, and also the quality of the logos that we get, and the initial size of the deal.”
“I’d love to get more of those VMware customers coming to us,” says Ramaswami, referring to what he has told me before is an opportunity to poach given VMware is being bought by Broadcom. “But it takes time,” he notes, to pursue those prospects and to woo them.
Renewals, on the other hand, continue to be “strong,” he says, with the overall “retention rate” in the vicinity of ninety percent.
When it comes to the improved profit outlook, I offered to play devil’s advocate. I asked Ramaswami, if the company sees lower operating expenses going forward even as its revenue outlook stays constant, is he potentially under-investing?
“That’s a great question,” says Ramaswami. “I can tell you that — we do a lot of benchmarking in terms of what our R&D is as a percentage of revenue relative to other companies — we are on the high side, and I will keep us on the high side because, for me, one of the key things here is continuing to invest in innovation.”
“We are not short-changing our R&D at all.”
I asked Ramaswami if any of the analysts asked him this evening about the take-out rumors in The Wall Street Journal in October.
“There was not a single question about that,” he says. “Again, my answer is, it’s not for me to comment on rumors and speculation, but we are very much focused on running our business.” (The day brought another speculative article, this one from Bloomberg’s Liana Baker, Katie Roof, and Scott Deveau, saying that Hewlett Packard Enterprise is interested in Nutanix and has had talks with the company in recent months, citing multiple unnamed sources. We did not discuss that article.)
Before parting, Ramaswami gave me a tip on his current reading: behavioral economist Richard Thaler’s Misbehaving: The Making of Behavioral Economics, published by W.W. Norton in 2015, which he recommends.
“He talks about how economists generally assume people are very rational, but that’s not the case: people do a lot of things based on emotion.
“You have to factor that in, otherwise your economic models are all off.”
Pure Storage delivers while Snowflake’s cautious outlook disappoints
Dec 01, 2022
This continues to be a quarter of uneven performance in the face of slumping corporate technology buying.
Following the painful example of cybersecurity vendor CrowdStrike on Tuesday evening, which missed expectations with its forecast for revenue for the first time ever, and saw its shares plunge nineteen percent, Wednesday evening brought a very mixed bag of quarterly results from vendors selling to the enterprise.
On the bright side, Pure Storage, vendor of flash-based storage equipment and accompanying software, and one of the TL20 stocks to consider, beat expectations but missed slightly with its revenue outlook. No one seemed to mind, as the overall report and comments tonight were much better than last week’s depressing outlook from competitor Dell Technologies.
Security software vendor Okta is up fifteen percent on a clean beat-and-raise quarter.
But Snowflake, another TL20 pick, while beating expectations, said it is taking a cautious approach to its outlook. And so, its forecast for this quarter’s product revenue was the biggest miss relative to consensus since the company came public in September of 2020.
First let’s talk about Pure. Its revenue outlook this evening of $810 million for the January-ending quarter is just slightly below the average estimate of $813 million. That’s the first miss on outlook in two and a half years, going back to the “COVID” quarter of April 2020.
However, the call was upbeat, and the sense from the analysts’ posture during Q&A is that everyone is happy the wheels have not come off at Pure like the way they did last week for Dell.
Pure CEO Charlie Giancarlo highlighted positives such as the company’s “annualized recurring revenue,” a measure of the contracted value of deals stretching out twelve months in time. Specifically for the category of subscriptions, ARR surpassed one billion dollars for the first time ever. Pure prices in different ways, but there’s an emphasis on selling more and more on a subscription basis, so this is an important milestone.
Giancarlo also made two forward-looking points. One, the company sees the declining price of NAND flash chips generally, industry-wide, as helping it sell more gear to replace traditional disk-based storage for the bulk data storage known as “near-line.”
Said Giancarlo, “We expect that the currently anticipated improvements in Pure’s NAND economics this coming year will enable Pure to deliver our TLC-based products at prices competitive with most near-line disk arrays on a total cost of ownership basis. We believe strongly that the days of the hard disk in the data center are over.” Bully for that.
Second, Giancarlo intimated that, as he has told me before, he foresees more of the large sorts of deals the company did last year with Meta Platforms for that company’s gigantic Research Super Cluster for artificial intelligence processing.
Said Giancarlo, “In terms of other hyper-scalers, our conversations continue where we're optimistic that we will see realizable opportunities there,” though he added, “But, again, too early to be able to put any real guidance on that.”
Regarding the broader outlook, Giancarlo told analysts he thinks IT spending will hold up in 2023 despite possible recession. “The way we're looking at it, is, a roughly flat US economy next year and perhaps a slightly recessionary international economy, obviously varying a lot country-by-country,” said Giancarlo.
“And as we go into that, we're seeing IT spending holding steady, maybe slightly up relative to the overall GDP growth.”
Now, that was a lot better than Dell CFO Thomas Sweet last week telling analysts “these dynamics are creating a broader range of financial outcomes for our upcoming fiscal year.”
In fact, I would say Giancarlo’s outlook is so calming, relatively speaking, his remarks sounded like the kind of upbeat stuff John Chambers, former Cisco Systems chief, used to dole out during conference calls.
I should note that Giancarlo told analysts that despite the “challenges and uncertainties of the current business environment, we remain confident in our ability to take share and outpace the market.”
Asked by analyst Amit Daryanani of Evercore why Pure’s outlook was so “impressive relative to peers,” especially Dell, Giancarlo remarked that it has to do with having a better lineup of product:
It’s based on a much broader portfolio we believe, you know, going from our roots, our initial product, which was you know, block-oriented, to now having file- and object-based systems. And now starting to pursue replacements for secondary tier disk alternatives. So, this allows us to expand into a lot of market adjacencies and allows a lot of elasticity in our market as flash prices decline.
Pure Storage shares, with tonight’s slight gain to $29.83, are down nine percent this year, and up fifteen percent since being picked for the TL20.
Over at Snowflake, things were not as thrilling or confident, though not bad by any means.
Snowflake continues to have astounding revenue growth at scale. The company’s revenue for the fiscal third quarter of $557 million rose by a whopping sixty-seven percent, year over year.
And the company saw a big surge in customers, especially those spending a million dollars a year or more. The “retention” rate, the measure of how much customers spend versus what they spent a year earlier, was one hundred and sixty-five percent, extraordinarily high relative to most software companies.
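For readers new to the metric, net revenue retention compares what a cohort of existing customers spends now with what the same cohort spent a year ago, excluding customers added since. Snowflake’s exact definition is in its filings; the figures below are hypothetical, just to show the mechanics:

```python
# Net revenue retention (NRR), sketched with hypothetical figures ($M).
cohort_spend_year_ago = 100.0   # what a set of customers spent a year ago
cohort_spend_now = 165.0        # what those same customers spend today
nrr = cohort_spend_now / cohort_spend_year_ago
print(f"NRR = {nrr:.0%}")       # every dollar of spend a year ago is $1.65 now
```

Anything over one hundred percent means existing customers alone are growing revenue before a single new customer is counted.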
So, Snowflake continues to find more takers for its software and it continues to squeeze a lot out of existing customers. Pretty great. And, Snowflake raised its outlook for the full year’s free cash flow, on an adjusted basis, to twenty-one percent of revenue, up from the seventeen percent it had offered back in August.
However, one number in the outlook is awful, relatively speaking: product revenue. Snowflake doesn’t forecast total revenue. Instead, it forecasts just the portion it makes from use of the product, as opposed to professional services. Product revenue tends to be about ninety-five percent of total revenue most quarters, so it’s a pretty good proxy.
The company forecast this quarter’s product revenue to be $535 million to $540 million. That is three percent lower than consensus for $553 million, according to FactSet. Most quarters, Snowflake’s outlook tops consensus. But even the few times the company’s product revenue forecast has missed, it’s been perhaps one percent at most. Ergo, this is the worst miss for product revenue forecast since the company came public.
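The size of that miss is simple arithmetic on the figures cited, taking the midpoint of the guided range against the consensus number:

```python
# Guided product revenue versus consensus, using the figures above ($M).
guide_low, guide_high = 535, 540   # company forecast range for this quarter
consensus = 553                    # FactSet average estimate
midpoint = (guide_low + guide_high) / 2
miss = (consensus - midpoint) / consensus
print(f"Guide midpoint of ${midpoint:.1f}M is {miss:.1%} below consensus")
```

That works out to just under three percent, versus the one-percent-or-less misses of past quarters.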
Like CrowdStrike the night before, the current economic climate is producing a period of “firsts” for some software vendors, and not in a good way!
In explaining the outlook on tonight’s call, Snowflake’s CFO Michael Scarpelli related to analysts how “over the past six weeks, we have seen weaker consumption in Asia-Pacific [excluding Japan], and SMB [small and medium business] segment.”
The term “consumption” refers to Snowflake’s method of invoicing customers. Snowflake, you’ll recall, bills customers not at a pre-ordained time, like the beginning of each quarter, but only as they use the software, as they “consume” it. That means revenue has an unpredictable element.
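A minimal sketch of how that differs from calendar-based billing, with hypothetical numbers (Snowflake actually meters usage in compute “credits”; dollars stand in here for simplicity):

```python
# Consumption billing, sketched: revenue follows usage, not the calendar.
contract_value = 120_000         # customer pre-purchases $120k of capacity
monthly_usage = [5_000, 8_000, 20_000, 3_000]   # actual consumption varies
recognized = sum(monthly_usage)  # revenue recognized only as capacity is used
remaining = contract_value - recognized
print(f"Recognized ${recognized:,}; ${remaining:,} not yet revenue")
```

The remaining amount becomes revenue only as the customer consumes it, which is why a six-week dip in usage flows straight into the forecast.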
On the one hand, Scarpelli said that, “Recent consumption patterns give us confidence that our largest and most strategic customers will continue to grow.”
On the other hand, he said, “With the holidays approaching and uncertainty with how customers will operate, we believe taking a more conservative approach is responsible as we resource plan for Q4 and fiscal 2024.”
Basically, Scarpelli is saying that the company just got a lot more cautious about how unpredictable the consumption pattern will be.
On top of the forecast miss, the preliminary outlook Scarpelli offered for next fiscal year is also lower. The company is assuming product revenue grows forty-seven percent for the full year, but the Street has been at fifty-one percent.
One analyst, Sanjit Singh, calling in for Keith Weiss of Morgan Stanley, asked Scarpelli if he could be sure growth wouldn’t actually be even lower next year. Scarpelli replied that the company “have a number of significant customers that we have signed up, that we see them ramping up next year on Snowflake.”
So, hopefully, all that good stuff about growth in customers is going to at least support the outlook.
One positive comes with slowing growth: Scarpelli said the company is going to slow hiring next year, even as it adds another thousand employees. The result will be a higher free cash flow margin of twenty-three percent, he said.
Among the notes out this evening, Sterling Auty with MoffettNathanson writes that although Scarpelli has a good track record of forecasts as a CFO, “investors are likely to debate the preliminary product revenue outlook for fiscal 2024 in terms of how reasonable it might look.”
Auty notes the stock has “the highest valuation in our coverage,” so it’s bound to “take a hit” on this lowered outlook.
Still, he argues, Snowflake is “a unique asset and it is unlikely to trade at a cheap valuation.”
Snowflake stock, with the decline to the after-hours price of $135.39, is down sixty percent this year, and down eight percent since being picked for the TL20.
CrowdStrike plunges nineteen percent on first-ever quarterly forecast miss
Nov 30, 2022
CrowdStrike, the cybersecurity technology maker, had not missed a forecast in the three-plus years since it came public, until tonight.
The shares are down nineteen percent in late trading after the company’s forecast for this quarter’s revenue came in two percent below consensus estimates for the fiscal fourth quarter ending in January. The company also gave an early indication that its revenue for next year will come in lower than expected.
CrowdStrike is most famous as being the firm working for the Democratic National Committee in 2016 that asserted that Russian operatives had hacked a server of the DNC.
The story this evening is a familiar one now in software circles: slowing deal activity in software land, and sales getting “pushed out.”
In prepared remarks, co-founder and CEO George Kurtz said that the company’s “net new ARR,” the quarterly addition to its annualized recurring revenue, “was below our expectations as increased macroeconomic headwinds elongated sales cycles with smaller customers and caused some larger customers to pursue multi-phase subscription start dates, which delays ARR recognition until future quarters.”
On tonight’s call with analysts, Kurtz gave more detail. He noted a particular weakness among smaller companies, the non-enterprise types. Some smaller firms were asking for extra time to sign a purchase. That both reduced the amount of ARR signed in the quarter, and also reduced the number of “new logos,” meaning, new customers, that CrowdStrike gained.
A total of fifteen million dollars worth of deals were “pushed out” of the quarter, said CFO Burt Podbere. Most of the company’s new business tends to come from those smaller companies. “When you think about fifteen million in that space, and what it means in terms of logos, well, you can do the math,” said Podbere.
The ARR for the quarter was “weighted more heavily” toward the companies spending a million dollars or more a year with CrowdStrike. Kurtz said those larger companies, the enterprises, continue to prioritize his company’s software, “but some also had to manage timing issues related to OpEx [operating expense] budgets and cash flow amidst the rapidly evolving macro,” which meant that they “signed contracts that have multiphase subscription start dates,” delaying, again, the ARR the company was able to book.
Kurtz added that he expects the traditional fourth-quarter “budget flush” won’t be happening this year among his customers.
Despite all that, Kurtz noted “strong inherent demand for our products” and reviewed many positives in the quarter.
When it came time for CFO Podbere to offer an outlook on things ahead, he told analysts the first half of the fiscal year starting in February will see more of that ARR “headwind,” and that as a result, “This would imply a low 30s [percentage] ending ARR growth rate, and a subscription revenue growth rate in the low to mid-30s for FY ’24.” That is below Street consensus for the company to have revenue growth for the year of thirty-seven percent.
On the plus side, Kurtz made the case that in his chats with large customers, “budgets are not in the enterprise getting cut […] we just haven’t seen it.” Moreover, he said that customers are looking to consolidate spending, and “they’d rather spend it with fewer vendors,” adding, “and I think that’s where CrowdStrike shines.”
Podbere was asked if any of the gloomy outlook will hurt the company’s cash flow. No, he said, “from a cash-flow standpoint, we see a path to 30% free cash flow margin next year,” adding, “I think that just goes back to the strength of the model, and the fact that we've got this business that is really durable.”
Podbere was also asked if he thought the company might have a leg up because it’s bigger than some competitors, to which his reply was a resounding “yes.”
“We actually see this as a great opportunity for CrowdStrike as we go forward,” he said, “as smaller competitors fall by the wayside, as private companies look for exits, we think it's a very attractive opportunity for us with our balance sheet, almost $2.5 billion in cash.”
“And at the end of the day, as these macro trends evolve, we see a great opportunity for us now into the future to continue to consolidate customers as well as other technologies that might fit within our platform.”
Shares of CrowdStrike, at tonight’s after-hours price of $111.82, are down forty-five percent this year.
The TL Podcast for November 27th, 2022: Analog Devices is a star, Dell is a downer, stay tuned for Qualcomm
Nov 28, 2022
TL20 name Analog Devices (ADI) is a different kind of chip company, much to its benefit; Dell Technologies (DELL) gives us a glimpse into a creepy 2023; and I’m about to interview TL20 name Qualcomm’s (QCOM) CFO Akash Palkhiwala.
Cambium CEO: 5G progresses, and the software story emerges
Nov 28, 2022
This month was yet another milestone in the turnaround of Cambium Networks, the wireless equipment provider that has been on a comeback trail after being severely hampered by the supply chain mess.
Cambium was a star in 2020 and the beginning of 2021, before being hit hard by a lack of parts. As I wrote in August, Cambium’s situation has been improving as it came to grips with the supply chain mess.
On November 2nd, Cambium's third-quarter report cemented the turnaround, with sales beating expectations by nine percent. CEO Atul Bhatnagar was kind enough to speak with me via Zoom. I offered some preliminary notes on that day, but I wanted to revisit our conversation because it has lots of gems about what’s going on in broadband infrastructure around the world.
As Bhatnagar told me back in August, the company appears to be at the beginning of a multi-year growth phase. That is because products that took years to develop are just now coming to market with multiple opportunities over multiple years to come.
“I think the most important thing to take away is that we are well-positioned for solid growth in 2023,” was Bhatnagar’s main point when I asked him November 2nd how he would characterize the earnings report.
Cambium makes a variety of wireless networking equipment used by both enterprises and smaller service providers to provide hundreds of megabits per second of wireless networking, either inside an office or over several kilometers of a campus or city environment.
Last quarter, products that already had great momentum got even stronger. In particular, the centerpiece of the company’s results for several quarters now has been sales of equipment for enterprise networks, including WiFi 6, the latest version of the WiFi networking standard. Those sales nearly quadrupled in the quarter, year over year, and the company raised its outlook for full-year enterprise product sales to rise by fifty percent, versus its prior expectation of forty percent.
One of the reasons the products are taking off is they’ve become a hit with service providers that are extending broadband to multiple dwelling unit [MDU] residential buildings, and in hospitality environments, such as hotels, to extend WiFi out to the pool, say.
“They value the simplicity,” Bhatnagar says of service providers in both markets.
“If you look at a typical hotel or MDU, people want high-quality broadband, they want high-quality WiFi, they want high performance, and they also want ease of deployment, because if you're building a campus, if you’re building apartment buildings now, there's so many WiFi devices, it's not just Wi-Fi access points, you also have to manage a large number of devices in the building.”
The Cambium software on the equipment makes it easier to manage all those devices that become part of the mix, says Bhatnagar.
“For example, if you go to hospitality, and they have digital locks on the rooms, we have APIs [application programming interfaces] that make it easy for them to handle the digital lock management via WiFi,” essentially, locking or unlocking rooms remotely from the front-desk.
“We are also getting traction in healthcare for senior living types of places,” says Bhatnagar. “Those verticals are emerging as very strong for us.”
An interesting insight into the market for 5G is that while mobile use of 5G on handsets is a global story, the “fixed” use of 5G, meaning, a tower set up to deliver last-mile service, is happening all over the globe except for in North America.
“5G adoption is happening in a lot of cases internationally faster because the spectrum availability internationally is greater,” says Bhatnagar.
For example, the company last quarter had one of its largest contracts ever for what’s called “point-to-multi-point” radio systems, in the Caribbean. These are radio systems that a service provider can attach to a utility pole, for example, to give people local Internet access over a kilometer or more from a switch. Service providers in Latin America, says Bhatnagar, are using those radios to provide hundreds of megabits of local access for the first time in those markets.
“It was in one of the islands in the Caribbean, and the islands are pretty interesting places,” says Bhatnagar, “because usually they are rocky or they have the terrain where wires don't work as well, so wireless is the right technology.” In those markets, just as in the U.S., “broadband everywhere has become a lifeline for health care, for students studying at home, or for working from home, and customers are demanding hundreds of megabits [per second of bandwidth].”
“We are not catering to very high-end service providers where the sales cycle itself is two years,” observes Bhatnagar of the 5G build-out internationally. Instead, he says, “We are going to mid-range service providers, and they are innovative, they move a little faster, they adopt new technologies a little bit more expeditiously.”
This kind of large deal in Latin America is the validation of 28-gigahertz point-to-point radio systems that Cambium has been working on for years. “Those proof of concepts are turning into production networks, and the deployments will happen for the next three, four years,” says Bhatnagar. “Every year, they'll buy multiple millions [of dollars worth] in products in terms of what they'll spend with us — over time, these are substantial deals.”
The same trend to build out broadband for the first time is happening in other international markets, such as in North Africa. What helps Cambium is that smaller service providers in these markets are moving much faster to roll out last-mile fixed wireless service than the incumbent telcos typically do.
In the U.S., the hot market at the moment is point-to-multipoint systems using spectrum at 6 gigahertz, commonly referred to as “C-band” spectrum. Cambium has new radio systems that are coming out this quarter to serve that market.
“The U.S. has released 800 megahertz of spectrum in that 6-gigahertz band,” he explains. “The 5-gigahertz band has a lot of usage, and a lot of noise, so this is additional spectrum to relieve the overcrowding.”
“That’s the product we are releasing this quarter, there are ten trials going on, and we are seeing excellent, excellent performance,” he says, including an ability to transmit almost two gigabits per second of bandwidth at a distance of over two miles.
“We are excited that for many wireless internet service providers in the U.S. who are used to the 5-gigahertz band, this will be similar to what they are used to, the U.S. will lead, and many other countries will follow, and 6-gigahertz will relieve the pressure on 5 gigahertz.”
“It will be a mainstream broadband access connectivity” technology, he predicts.
Might I, I asked Bhatnagar, be able to dump my local cable provider in New York City and use this instead? “Absolutely, I think you will see this broadly adopted across the board,” says Bhatnagar. “It’s like bringing a four-lane highway to your neighborhood.”
Emerging from all of these wireless equipment markets is a fascinating software story. Cambium sells software that it provides as a cloud service to manage all the equipment it sells, including other vendors’ equipment. It doesn’t disclose the exact dollar amount of software sales per quarter, but it does tease investors with high growth rates.
The most feature-rich version of the Cambium software, called “cnMaestro X,” saw sales more than triple last quarter, the company said. That growth is a product of some very large deals. In Asia-Pacific, Cambium signed a deal with a service provider to put 20,000 fixed broadband “seats” under management by the cnMaestro X software. Cambium’s total number of devices under management with cnMaestro across the board was 866,000 at quarter’s end. This is starting to become a substantial fleet for the company.
I asked Bhatnagar when he will disclose dollar figures for software sales. “We've been working on software for about eighteen months or more, and we are beginning to report growth numbers,” he says, “and I think as it becomes a sizable contribution [to revenue], you'll see us get the dollars on it,” meaning, reporting revenue.
In the meantime, “Our gross margin is improving because software is beginning to contribute,” observes Bhatnagar.
The software’s new elements have come as features. One is called “cnHeat,” which provides a “heat map” that tells a carrier how to orient radio systems to transmit the strongest signal to the customer. Another new feature tells the service provider which applications are consuming the customer’s home broadband.
“If a customer calls them, the service provider can say, ‘The reason you're seeing very low performance right now in your broadband connection is because there's gaming going on in your house, and that's sucking up this type of data’.”
“So, we are beginning to add the sophistication of application visibility, and our key message is, We can give you a solution so you can deliver an exceptional digital experience.” Or at least deal with irate calls, I suppose.
One topic that is intriguing but that we didn’t have time to dig into is Cambium’s growing military business. This quarter was the highest in the company’s history for new orders from defense outfits, a group that includes governments around the world interested in stepping up communications on the battlefield.
“The national security in every country, every region, has been ratcheted significantly in the last year,” says Bhatnagar. “Every war theater is realizing that effective communications technologies are very key in this modern era.” The Cambium equipment is “blast-proof,” he notes, and certified in multiple respects for security and other military concerns.
Especially in demand now are so-called point-to-point systems that send a wireless signal upward of fifty kilometers.
“We anticipate very good business next year in defense, and a strong second half this year,” says Bhatnagar.
Cambium stock is down twenty-one percent this year at a recent price of $20.60, and up eight percent since the report.
Confluent’s CEO sees a vast, expanding world of streaming
Nov 27, 2022
Every time software maker Confluent has to report quarterly results, its co-founder and CEO, Jay Kreps, has to play a bit of the storyteller. His prepared remarks on the earnings conference call often lead with a tale about the trends in technology, like a miniature seminar for Street analysts.
“I do that first part myself, and I put a lot of effort into it because we are a genuinely new category,” Kreps told me this past week in a meeting via Zoom, regarding how he scripts his prepared remarks. Earlier this month, for the third-quarter earnings call, Kreps told a tale about how the Confluent technology is emerging as a “Fourth Estate” of data, a concept that was pretty deep as financial conference calls go.
“If we were the hundred-and-first database [program], maybe it wouldn’t be necessary,” Kreps tells me. “But this is genuinely a paradigm shift, and you know, What even is it?”
Indeed, what this stuff even is, is a question I’d wager most investors can’t answer, even those who own Confluent. In the year and a half since Confluent came public, Kreps has been eloquent and authoritative in his seminars, but the abstruse nature of the software means that the What, and even more, the Why, probably still elude people.
In brief, Confluent sells its own version of an open-source software program whimsically named Kafka after the Czech author Franz Kafka. The software was invented by Kreps and colleagues when he was at LinkedIn a decade ago. As I explained in my first interview with Kreps, in September of last year, the Kafka program is a form of “middleware” software — basically, plumbing for data. Kafka moves data to where it is needed by a given application.
Kreps calls it “data in motion,” and you can think of it as a way to “stream” corporate data from one app to another — kind-of like how a person streams Netflix shows from a server to their PC or phone.
A better definition of Kafka is that it is a diary of all the things that happen in a computer system. Imagine you’re on The Technology Letter home page and you click the button labeled “Subscribe to updates.” The form asks you to type in your email address and push the “Sign Me Up!” button.
Typing your email address and pressing the button constitute an “event” in the parlance of Kafka. A new entry is made in the Kafka software diary. Bing! So-and-so signed up. The Kafka software then notifies other programs in the computer system that need to know about that event. For example, the mailing list program needs to add your address to the list. And the email program needs to send a welcome email to you as a new subscriber.
I’m imagining this because I have no actual knowledge of how Squarespace, the host of this newsletter, runs their site or if they use Kafka. But it would make sense. Notifying programs of a new event is a useful function because, surprising as it may seem, programs in a computer system don’t automatically know that events happen; someone or something has to make that connection for them.
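The diary-and-notify mechanics can be sketched in a few lines of Python. To be clear, this is a toy, in-memory stand-in, not real Kafka code: all the names here are invented for illustration, and actual Kafka is a distributed, persistent service, not a Python class.

```python
# Toy sketch of Kafka's core idea: an append-only event log (the "diary")
# that notifies subscribed consumer programs of each new event.
# All names are illustrative; real Kafka is a distributed service.

class EventLog:
    def __init__(self):
        self.events = []        # the append-only diary
        self.consumers = []     # programs that want to hear about events

    def subscribe(self, consumer):
        self.consumers.append(consumer)

    def publish(self, event):
        self.events.append(event)        # record the event first
        for consumer in self.consumers:  # then notify every subscriber
            consumer(event)

# Two downstream "programs" reacting to the same event.
mailing_list = []
welcome_emails = []

log = EventLog()
log.subscribe(lambda e: mailing_list.append(e["email"]))
log.subscribe(lambda e: welcome_emails.append(f"Welcome, {e['email']}!"))

# A reader clicks "Sign Me Up!" -- one event, two notified consumers.
log.publish({"type": "signup", "email": "reader@example.com"})
```

The point of the sketch: neither the mailing-list code nor the email code knows about the other; each only knows about the diary, which is what makes the hub pattern scale to many programs.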
Confluent is in the business of spreading the use of Kafka as a kind of universal diary that underlies all applications. Every single program would be tied to Kafka as a hub that ferries notices of events in and among and between those apps — thousands, millions, even billions of digital events every second, be they events on a Web page such as in the example above, or the change in status of a piece of heavy equipment somewhere in the world that is connected to the Internet for monitoring.
The reason that Kreps keeps having to educate investors is that Confluent’s financial results continue to defy reason, so that investors struggle to understand the business. Usually, recently public companies have a sophomore slump, when the streak of good news runs out. Even at the seeming precipice of recession, that hasn’t happened to Confluent — yet.
The November 2nd report was the sixth quarter in a row since Confluent’s IPO in which revenue and net loss per share beat Street expectations. And it was the fourth time in a row Confluent raised its revenue outlook for this year. That doesn’t jibe with investors’ sense that budgets are getting tighter and software deals are getting “pushed out,” something many vendors have commented on.
That leaves Kreps where he was in August, when last we spoke, explaining why his company hasn’t stumbled. “Investors are trying to figure out: there's a lot of exciting trends in tech, and which of those are, kind-of, durable things where there's real value, and aren’t going away,” says Kreps.
“The reality is, we've seen continued growth and strong performance, and strong net retention with customers,” says Kreps. “I think that's probably due to a number of things, not the least of which is, right now everybody's focused on efficiency, and for a lot of organizations, these cloud services are an easy way of just doing things with fewer people.”
Confluent sells its version of the Kafka software as both a right to use “on premise,” in a company’s own facilities, and also as a managed service inside public cloud computing facilities. The latter was thirty-eight percent of revenue in the most recent quarter. Cloud revenue is rising even faster than Confluent’s high overall growth, a hundred and twelve percent last quarter, year over year, versus just forty-eight percent growth for total revenue.
It turns out that Kafka software is challenging for companies to implement on their own, which may be driving companies to hand over the work to Confluent as a managed service in the cloud.
During Current, Confluent’s user conference in October, Andrew Hartnett, the head of engineering for software maker New Relic, told the audience his company switched from running Kafka internally to using Confluent’s cloud because doing it themselves was getting prohibitively complex. “Unless you are prepared to spend a massive amount of money on large teams that support Kafka 24/7, it's very, very difficult,” said Hartnett.
“There is a ton of open-source software, and a lot of companies are asking should I build versus buy,” observes Kreps. “There are a lot of great open-sourced managed services now,” he says, “and times are just tighter, and companies are looking at all the projects they have to get done, and they're realizing, we're not going to have as many new engineers to go do it as we thought.”
More broadly, the Confluent software is the kind of stuff that is mission-critical, he insists. “There's two things that matter in tighter times,” he says. “One is what are the projects that really let us make more money, and the other is what are the projects that improve our operations and help us save? To some extent, every investment is lined up to one of those in any time.”
“But,” says Kreps, “the connection has to be pretty tight, not three steps away,” meaning, the connection of what the software can do to those two goals has to be obvious and immediate.
“Maybe it’s the nature of the real-time streaming area, it tends to be associated with the actual activity of the business,” he says, “like, interaction with customers, how you make money, how the business operates, because it’s things happening right there versus the twenty-seventh report you get at the end of the week, which, maybe helps the business somewhere, but it’s three steps out.”
Kreps is poking fun at a lot of business software that does stuff like generate reports, which has always been of dubious value. “We tend to be attached to these projects that are operational, and that’s why they don’t get the axe,” he contends.
All of which is fine, but, I ask, what about a prospective customer who has not yet written a check for the software? What’s going to convince them not to hold off on implementing Kafka, given that it seems a big commitment? What will prevent new business from crashing in a recession?
When I present this concern to Kreps, he replies that the canny strategy at Confluent to lure new business is not to present the intimidating vision statement of a vast, universal diary, but to start small.
“It’s important to have a big picture view of your role in the organization, but the way that you get there is not by selling customers some big picture of a central nervous system across everything,” Kreps tells me. “Projects where it’s like, ‘We’re going to put everything in the data lake, and figure out some way to get value out of that afterwards,’ the reality is it tends to be pretty hard to get that last bit of value.”
Instead, “It's really use case by use case, that’s a much more robust way to get to the same goal.”
His pitch to customers, he says, is to start simple, asking, “What's one application [of Kafka] in, for example, personalization, or operational efficiency — it’s different in any industry — that will save you money and make you more effective and make more people buy?” Once that is established, one can move on to “then the second use case, and the third,” and so on, he says.
In service of that goal, Confluent in October announced two new capabilities on top of Kafka, which have the dual purpose of bringing aboard new users and also differentiating Confluent’s version of Kafka from the freely available, plain-vanilla open-source version.
One is “Stream Designer,” a tool that lets people more easily assemble Kafka connections without deep knowledge of Kafka.
“There's companies where streaming is right on the edge of what they can do,” observes Kreps. “And so, step one is make those people more productive.”
“We're starting with the software engineers” inside companies, he says, “because that's our existing customer base, but, you know, the goal is really to democratize this whole idea of streaming and make it easy enough that anybody with basic technology skills can do it.”
What gets built with Stream Designer, he says, “is light and easy,” simpler kinds of Kafka functions. “There’s already people building deep code against Confluent,” he says, custom code, things such as stock exchanges, and, “big payment systems all built on Confluent, that are super-important, the beating heart of some businesses.” By contrast, Stream Designer is “all the other stuff around the edges that hooks that stuff up.”
The second program that Confluent rolled out is “Stream Governance Advanced.” It is like a Google Maps, says Kreps, to tell those working with Kafka, and those with whom they work, which systems and data Kafka is touching — the “data flow.”
“If we say ‘Hey, we want to open up use of data across the organization, we have all this infrastructure to make that great,’ that’s only half the problem, right?” explains Kreps.
“You want companies to use data in a way that's smart, that makes them more effective, but they need to do it in a way that's safe and secure and in compliance with the law,” is the spirit of the tool, he says.
Stream Governance Advanced had been “one of the most requested feature sets” among customers, he says. “This is just a gigantic headache for every company in the world,” meaning, to regulate access.
Are there other things like Stream Designer that he has in mind?
“Yeah, absolutely,” says Kreps. Kafka is the basic infrastructure, and on top of that, “when we look at our customers, there's dozens of these use cases that are quite general” that can be helped along, he says. “When we talk with customers, we’re always taking notes, thinking, Hey, you know, is there something more general that we could do to make these people’s lives easier?”
Those conversations result in “a long list of what we call up-the-stack use cases that are just one click from data streaming,” he says. “We look at that as an opportunity for us to come in and add a little bit more value.”
Because Kafka is so new, there are in fact acres of such opportunity. “The downside of a new category is you’ve got to spend a lot of time explaining it to investors and everybody else,” says Kreps of the vagaries of Kafka. “But the upside is, there's a lot of white space around what we do, and we see that as an opportunity for us as well as for partners and others to help grow into.”
As to what, specifically, those new opportunities are “up the stack,” Kreps demurs. “Obviously, I can’t pre-announce our roadmap,” he says with a chuckle.
What he will say is that, in general, “there is this emerging category of new technologies, not just Kafka and Confluent, but a whole set of things around streaming that plug into that in different ways; there’s a ton of enthusiasm for it.”
I can imagine there are a lot of young businesses out there, even with venture funding having dried up, that are already building such things, some of which could be acquisition targets for Kreps. Those include programs built on top of Kafka, such as Apache Pinot, an indexing program that has been commercialized by the startup StarTree that I profiled last year.
Kreps authored a short book published in 2014 by O’Reilly that describes Kafka as an example of a “log,” a technology for computer systems that serves as a sort of diary of events. It is an eminently readable introduction to the whole streaming idea.
Talking about future product is particularly interesting because Kreps is one of those CEOs who is “close to the metal,” as they say. Not only did he co-invent Kafka and a bunch of other technologies; not only did he literally write the book on Kafka; in addition to all that, his current credentials include serving as interim chief product officer since last quarter, when Ganesh Srinivasan stepped down after four years.
“We’re actively looking for my successor” as product lead, says Kreps. Although it is “a little bit busy when I have two jobs,” he says, on the bright side, “whenever there’s a vacancy, that’s actually the one chance to really kind of get in there and, you know, get your hands dirty.”
Since I have his ear as the product lead, I decide to drill down into that new category of streaming things. Streaming is the technical term for when programs receive and act upon that diary of events that’s constantly coming out of Kafka.
If there is a new category of acting on such streams, does it, in some sense, replace other stuff, such as databases?
That is the automatic presumption of the Street, but Kreps has a somewhat more nuanced view. Yes and no: Kafka and streaming both accompany and, to a degree, displace older data technologies.
“From the point of view of investors, there’s not that many new dollars” available for IT spending, Kreps concedes, “and so any dollar spent on this,” meaning, Kafka and Confluent, “is taken from that,” meaning, older stuff.
His lecture on the conference call earlier this month concerned the three main “estates” of data, those being bespoke, in-house apps, like the kinds big banks build for themselves; software-as-a-service apps, or SaaS, such as Salesforce; and data analytics programs, things such as Alteryx that let people perform analysis.
“All of those are incredibly important,” says Kreps. Kafka is a Fourth Estate that is placed beside them, he says; the old stuff doesn’t go away.
However, in the next breath, Kreps says that “there are a lot of things [in enterprises] that ought to be happening off the flow [of the business] that are instead some kind of batch process that happens at the end of the day.”
Much of the world of software has been developed as a bunch of stuff that gets dumped in a repository, such as an Oracle database, and only later worked on. That is what you call “batch,” because it is handled as a bunch of stuff, divorced from when the data was gathered.
The Kafka ethic is the opposite: handle stuff immediately, like the example I gave above of the welcome email for the new newsletter subscriber.
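The contrast can be made concrete with a small, hypothetical Python sketch (no actual Kafka code; the names are made up): the same three events, handled the batch way and then the streaming way.

```python
# Illustrative contrast: the same events handled in batch -- collected,
# then processed later -- versus streamed, where each event is acted on
# the moment it occurs. All names here are invented for illustration.

events = ["signup:alice", "signup:bob", "signup:carol"]

# Batch: dump everything into a store, act on it at the end of the day.
store = []
for e in events:
    store.append(e)                 # gather now...
batch_welcomes = [f"welcome {e.split(':')[1]}" for e in store]  # ...act later

# Streaming: act inside the loop, as each event arrives.
stream_welcomes = []

def on_event(e):                    # a per-event handler, Kafka-style
    stream_welcomes.append(f"welcome {e.split(':')[1]}")

for e in events:
    on_event(e)                     # no waiting for a batch window
```

Both paths arrive at the same answers; the difference is timing — the streaming handler has each answer the instant the event happens, while the batch path has nothing until the whole pile is processed.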
That whole new category of streaming to which Kreps refers is in some sense taking data in a different direction, the real-time direction.
"Most businesses aren't these, kind-of, batch, end-of-day processes,” he continues, “it's something much more continuous that's happening out in the world — and so, yeah, there's probably a bit of displacement as well.”
The displacement of the old might be something new called “stream processing.” That is the term for when programs are built from the ground up to handle Kafka’s constant events. Imagine if all software were built only to do something as soon as an event happened, rather than waiting. One of the other technologies Kreps has helped build is an open-source program called Samza, an “engine” that plugs into programs so that they gain the ability to respond to the events of Kafka. (Samza is another Kafka reference: Gregor Samsa, the protagonist of The Metamorphosis.)
To understand this new direction, think about using Twitter. Do you gather a bunch of your favorite tweets, put them in a box, and later on in the week, over a cup of tea, calmly reflect upon them, meditating on the best response? Probably not. You probably respond impulsively with a tweet as soon as a tweet somehow pricks you.
That’s the emerging divide in software, in a sense. The old way is waiting to reflectively act upon what’s been gathered or accumulated, while the new way, the way of Kafka and stream processing, is building a real-time network of apps centered on the diary, the hub, together forming a software blob that is continually acting rather than waiting.
That is, in fact, what Kreps accomplished with Kafka at LinkedIn: it turned LinkedIn from an entity that reviewed database material later on to one that responded in real time, in split-second fashion, constantly alerting users to new information.
“I think a lot of what was previously, kind-of, batch processing will move into real-time streams,” says Kreps, when I ask how far this divide can go. “You can, kind-of, view every business process as taking in some data streams and producing some other data streams.”
An example, he suggests, is claims processing in an insurance company. “You have a bunch of inputs to claims processing, and you have some output, such as, we’re going to approve it or we're going to deny it or we're going to send a human to go investigate in person,” he explains. “That's something that's moving into a much more real-time world,” he suggests, “so that it can happen online, while you wait, where it used to be some batch process that was computerized, yes, but it took two weeks to get the output,” meaning, the answer.
If business is changing in that way, it suggests a far deeper role for Kafka. It is conceivable Kafka could, for example, become the new database for a lot of data processing in enterprises.
“If you asked the average technologist, they would tell you, there’s limitations” to Kafka, muses Kreps. “You can do this real-time stuff, but it’s not very efficient, you’re not going to be able to guarantee transactional correctness,” the safeguards of traditional databases. “There’s a long list of things that won’t be possible” has been the argument of the naysayers.
But, says Kreps, those objections have steadily fallen by the wayside. “Nobody believes anymore that it’s less efficient,” he says of Kafka. “Nobody believes it can’t scale anymore, and nobody believes you can’t have the same transactional correctness” as a relational database from Oracle. “All those have been checked off.”
“And so, I think there's a very clear view of the database world where you say, Hey, we used to have these queries, and they would process the data, and when they get to the end of the [database] table, you stop,” says Kreps.
“But in the future, the ability for these [queries] to just, kind-of, keep working as the business evolves, and keep producing updated results all the time — that's actually a very powerful thing that's much more in line with how we use data now, and is going to have a very significant role.”
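What such a never-finishing query might look like, reduced to a toy: a running count that refreshes its answer on every event, rather than scanning a table once and stopping. The class and method names below are invented for illustration; real stream processors, such as Confluent's ksqlDB or Apache Flink, do this at scale against Kafka topics.

```python
# Sketch of a "continuous query": instead of running once over a table
# and stopping, the query maintains a running result that updates with
# every new event. Names are illustrative, not a real streaming API.

from collections import defaultdict

class RunningCount:
    """A tiny materialized view: counts events per key, updated live."""
    def __init__(self):
        self.counts = defaultdict(int)

    def on_event(self, key):
        self.counts[key] += 1   # the result refreshes on every event

view = RunningCount()
for page in ["home", "pricing", "home", "docs", "home"]:
    view.on_event(page)         # the "query" never finishes; it evolves

# At any instant, the current answer is simply available in view.counts.
```

The design point is that the answer is always current: there is no moment when the query "gets to the end of the table," because the table never ends.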
I like the thought of Kafka as The New Database, though it may take some time. To make an analogy with finance, traditional database technology is like a balance sheet: it tells you what things were like at a particular snapshot in time, usually the close of a three-month period.
In contrast, the cash flow statement is a flow or a stream, in the sense that it tells you, over three, six, nine or twelve months, how money has moved through the business. Cash flow tends to be my favorite financial statement, and I like the notion that Kafka might be a way of working with data that mirrors the cash flow rather than a single snapshot in time like the balance sheet.
Speaking of cash flow, it’s negative right now in Confluent’s case, which leads me to ask Kreps about his stated promise, repeated last month, that Confluent will achieve a positive non-GAAP operating margin at the end of 2024.
That’s a while to wait, and given investors’ enhanced appetite for profit these days, I ask Kreps why he hasn’t laid out a more detailed path to profitability.
“It’s a good question,” he allows. “It's obviously a time where every tech company is under philosophical scrutiny on efficiency,” he notes. “It wasn’t like we never thought about the growth and profitability trade-off,” he says. That trade-off has been contemplated since well before the IPO, he says. But, “we don’t put all that out there because it doesn’t make sense to have too many different targets at once,” he says.
“We gave that one [target] just because we felt like, Hey, it’s good to have a crisp timeline of how we’re building toward this,” meaning, profitability. “You can look at the business now, you can draw a line, and understand the progress we would expect over the next year or so.”
Perhaps the more pressing financial question right now is the company’s forecast this month which, like reported results, defied expectations.
Kreps and CFO Steffan Tomlinson were so bold as to project next year’s revenue higher than Street consensus, a range of $760 million to $770 million versus the average $763 million estimate. Predicting next year is something few companies have dared to do amidst conflicting economic indicators.
Where does he get the nerve to go there?
“We have a plan internally that's not final, but it is kind of shaping up,” he says. “We have a reasonable view of how the business is developing, and so, it’s reasonable, in a time of great uncertainty, not to wait another quarter: we might as well give people a little bit of a sketch of where we’re going.”
Perhaps you could think of it as real-time, streaming financial forecasting.
Confluent shares are down seventy percent this year, and down forty percent since IPO.
The secret to Analog Devices’s success
Nov 23, 2022
For chip companies, much of earnings season has been a let-down because of weakening markets such as personal computers. But one chip company is sailing through it.
Analog Devices on Tuesday morning reported its tenth quarter in a row of topping revenue expectations, and its thirteenth quarter in a row of topping consensus with its revenue outlook.
CEO Vincent Roche told analysts during the morning conference call that it had been a “record quarter” to top off “a banner year.”
The stock rose six percent on Tuesday, and was up again on Wednesday. Price targets are rising at numerous shops to over $200, which would be a gain of sixteen percent or more from a recent $172.97.
I was happy to see all that given that Analog Devices is on the TL20 list of stocks to consider owning. The shares, with this bounce, are now up nine percent since I inaugurated the TL20 in July.
Analysts, though, struggled Tuesday to understand just how this could be such a great time for Analog given what’s going on in the rest of the chip market.
Said Roche, “ADI, like the rest of the industry, is not immune to a softer macro environment and thus, we remain cautious, yet optimistic.”
That was not enough for the Street. “Are you surprised why your orders and bookings are holding up better, even though all the headlines we see from a macro perspective seem to be getting tougher?” asked Merrill Lynch’s Vivek Arya.
Analyst Ambrish Srivastava seconded the inquiry. “I think Vivek asked the right question,” he said. “Were you surprised? Is there a seasonality to it? I mean nobody doubts your positioning and how strong you are in your chosen markets.”
The response from both Roche, and from CFO Prashanth Mahendra-Rajah, was that Analog Devices is not exposed to the same markets and product categories as all those other companies.
Said Roche, “Never have we been more diverse in terms of geographies, customer coverage, depth of coverage, depth of engagement,” adding that Analog has “product life cycles that stretch into the decades with very, very stable pricing.”
He was, in other words, making a case that Analog’s profile as a chip supplier is rather different. And that very much holds up if you look at what the company sells and to whom.
Unlike Intel and AMD and most other chip makers that focus on manipulating digital ones and zeros, Analog, as its name would suggest, has a very large portion of its product portfolio in what are called analog chips. These are chips that manipulate some kind of real-world signal, such as heat or sound or light, or electrical voltage. They either convert that signal to ones and zeros, for processing, or they directly manipulate the signal in real time, as a continuous variable.
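The conversion half of that job can be modeled in a few lines of Python. This is only the arithmetic of quantization, not how an actual converter circuit works, and the function name and parameters are illustrative: an analog-to-digital converter (ADC) samples a continuous signal and maps each sample to one of 2^N discrete digital codes.

```python
# Simplified model of what an analog-to-digital converter (ADC) chip does:
# map each sampled voltage to one of 2^N discrete integer codes.
# Illustrative only; a real ADC is an analog circuit, not arithmetic.

import math

def adc_sample(voltage, v_ref=1.0, bits=8):
    """Map a voltage in [0, v_ref) to an N-bit integer code."""
    levels = 2 ** bits
    code = int(voltage / v_ref * levels)
    return max(0, min(levels - 1, code))   # clamp to the valid code range

# Sample a sine wave (scaled and offset into [0, 1)) at 8 points.
samples = [adc_sample(0.5 + 0.495 * math.sin(2 * math.pi * t / 8))
           for t in range(8)]
# Each sample is now ones and zeros -- an 8-bit code a digital chip can use.
```

More bits means finer resolution: an 8-bit ADC distinguishes 256 levels, while the high-resolution converters in medical imaging distinguish tens of thousands, which is part of why resolution claims come down to converter quality.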
This is why Analog Devices is rather unique, and why its current fortunes don’t align with the trouble everyone else is seeing.
As I wrote in a longish piece in 2021, it is the manipulation of those real-world signals that gives Analog Devices tremendous breadth, depth, and variety in the products and markets it supports. Over half the company’s revenue comes from what is called the “industrial” market, an amazing cornucopia of devices: things such as sensor chips that monitor factory equipment for levels of vibration that hint at faults, or medical devices, where its chips boost the resolution of CT scans.
During Tuesday morning’s call, it was all those strange, unique markets that were, according to Roche, still surging even as markets such as smartphones and PCs caused problems for other chip makers.
Roche described a variety of “design wins,” when the company has been selected to have its chips built into a certain product. A piece of diagnostic equipment to “monitor machine health” at a “global supplier for energy exploration.” Chips for “high-voltage testers” of electric vehicles and renewable energy systems. Wireless transceiver chips going into 5G wireless network infrastructure. So-called “gigabit” communications chips that make for high-resolution displays in the cockpit of new cars.
Moreover, said Roche, the company has contracts that allow it to see years down the road for many product categories because they are not things like phones: they don’t change with the fashion every year. These are industrial products that are designed and assembled over many, many years.
For example, said Roche, “digital healthcare has been growing at the company in double digits for the last seven years or thereabouts,” in terms of revenue from health equipment like CT scans. “We expect to see that continue.”
And aerospace and defense markets, said Roche, are “likely to be a very brisk business.” They’ve been “performing well for ADI now, and I believe, at least for the next five years, we will see stellar growth in that area.” The company’s chips for “energy and sustainability businesses are also beginning to really go on the uptick.”
EVs, said Roche, are a particular area of focus that’s paying off. “We're getting a very strong tailwind from the electrification of the vehicle, in fact, we're gaining a lot of share in general, I think, with in-cabin and the electric vehicle,” he said.
At the same time, Analog Devices is defined by what it is not. It does sell chips into consumer electronics markets, which make up thirteen percent of the company’s revenue. However, said CFO Mahendra-Rajah, a third of that revenue “is derived from long-life-cycle prosumer applications, including next-gen conferencing systems, professional AV and home theater,” things not necessarily as volatile as smartphones, in other words. The rest of the consumer revenue, he said, the other two thirds, “relates to the faster-growing wearables and hearables as well as premium smartphones.”
That latter two-thirds is “cyclical,” meaning, it also succumbs to economic trends, said Roche. But while he didn’t quantify the impact to such consumer chips, Roche pointed out that “our Consumer business continued to grow despite industry-wide weakness.”
The bottom line, then, for Analog Devices is having a better profile to its choice of products and markets, things that are part of building complex systems, such as factories and wireless infrastructure, and electric vehicles, and which don’t suddenly stop when economic times get rough.
I would note, too, one other thing that can easily be missed. Analog Devices’s revenue for the year ended last month was twelve billion dollars. The entire semiconductor market in 2021 was worth over half a trillion dollars, according to the industry consortium World Semiconductor Trade Statistics.
What that means is that Analog Devices is equivalent to about two percent of the market’s total sales value in any given year. And so, the company simply isn’t exposed to the market to the same degree as, say, Intel, with $64 billion in annual sales, or Qualcomm, with $40 billion in annual sales.
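The share math is straightforward. Here’s a minimal sketch using the figures quoted above; the exact WSTS 2021 total is my assumption of roughly $556 billion for the “over half a trillion” figure:

```python
# Back-of-the-envelope market-share check using figures from the article.
# The WSTS 2021 market total is assumed here as ~$556B ("over half a trillion").
adi_revenue_b = 12.0    # Analog Devices trailing-year revenue, in $ billions
market_2021_b = 556.0   # assumed WSTS 2021 semiconductor market, in $ billions

share = adi_revenue_b / market_2021_b
print(f"ADI share of total chip sales: {share:.1%}")  # roughly 2.2%
```

Small shifts in the assumed market total don’t change the conclusion: ADI is a low-single-digit percentage of the industry’s sales.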
Dell shocker: Estimates are going way down for 2023
Nov 22, 2022
You can’t see body language on a conference call, but I’d imagine the body language was squirmy on Dell Technologies’s call with analysts Monday evening to discuss the company’s fiscal third quarter and its outlook.
The reported results topped expectations, but just narrowly on the top line. The company’s forecast for this current quarter’s revenue was off by a billion and a half dollars relative to the Street, the second forecast miss in a row.
But the squirmy part came when the outlook for next year was discussed, like something bad sitting at the back of the fridge that no one really wants to look into.
The backdrop is that sales of personal computers continue to fall apart, especially consumer PCs, as has been the case all year long. Sales of server computers and the attendant corporate infrastructure — networking and storage — are the bright spot, but even there, growth turned out weaker last quarter than the company had expected going into the quarter.
Server sales, obviously, are starting to be a victim of corporate customers reining in purchasing. IT is tightening its belt.
When CFO Thomas Sweet got to talking about 2023, or what Dell considers fiscal 2024, ending in January of 2024, he said all the same problem issues will be there, including “ongoing global macroeconomic factors, including slowing economic growth, inflation, rising interest rates and currency pressure.”
And he added, “these dynamics are creating a broader range of financial outcomes for our upcoming fiscal year, particularly as we think about the second half of the year,” emphasis my own.
Now, a “broader range” is Street code for uncertainty, and Sweet tried to help by adding, “With what we know today, it's likely next year's revenue is below historical sequential, using our Q4 guidance as a starting point.”
That was not enough for analysts, and so Sweet was challenged by David Vogt of UBS, who asked, “Can you, kind of, elaborate on your earlier remarks about the framework for 2024?”
Sweet replied, “I don't want to get into exactly what next year looks like because we're still working our way through it.” But then, he offered a formula:
If you took sort of the midpoint of our guide and then ran normal historical sequentials, say, over a couple of – two-year historicals and maybe haircut those a bit, I think you're going to be in the ballpark of what our current thinking is, recognizing that it's going to continue to evolve and change over the coming months.
Well, if one does a back-of-the-envelope calculation, using historical quarterly growth for Dell, what you come up with is a forecast for next year of about $93 billion, which is five billion dollars below the current consensus of $98 billion.
Here, I’ve put it in a table. The table uses the average ratio of one quarter to the next over the past five years, that is, what Sweet calls the “normal historical sequentials,” and extrapolates from the January quarter forecast given this evening.
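For what it’s worth, the extrapolation itself can be sketched in a few lines. The guidance midpoint and sequential ratios below are placeholder assumptions for illustration, not Dell’s actual figures:

```python
# Sketch of the back-of-the-envelope above. All numbers are illustrative
# placeholders, not Dell's actual guidance or historical ratios.
q4_guide_midpoint_b = 23.5  # hypothetical January-quarter guide midpoint, $B

# Hypothetical average quarter-over-quarter revenue ratios (Q1/Q4, Q2/Q1,
# Q3/Q2, Q4/Q3), as if derived from two years of history and lightly haircut.
seq_ratios = [0.95, 1.02, 0.98, 1.01]

rev = q4_guide_midpoint_b
year_total = 0.0
for ratio in seq_ratios:
    rev *= ratio          # step each quarter forward by its sequential ratio
    year_total += rev

print(f"Implied next-year revenue: ${year_total:.1f}B")
```

Plugging in Dell’s actual guide midpoint and five-year sequentials, per Sweet’s formula, is what yields the roughly $93 billion figure above.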
Some analysts are going to be cutting numbers even more deeply — giving a really big haircut. Aaron Rakers with Wells Fargo, noting that “Dell's F4Q23 guide and directional F2024 comments will be considered negative,” cuts his revenue forecast to $88.7 billion, almost ten billion dollars below consensus, a drop in revenue next year of eleven and a half percent.
I know it sounds contradictory, but this warning from Dell tonight seems to me one of the first concrete reads on uncertainty. The uncertainty is palpable for many companies, and Dell has just given a shape to it, for what it’s worth.
Dell shares declined in late trading by about two percent to $40.21. The stock is down twenty-eight percent this year.
The TL Podcast for November 20th: The chips have it and what comes after cloud?
Nov 21, 2022
A rebound in chip stocks that’s quite interesting, including TL20 names such as Taiwan Semi (TSM); some software makers, such as Alteryx, argue they’re recession-resistant; and it may be time to think about what could happen to cloud computing after a recession.
Nutanix CEO: A time of ‘inflection points’
Nov 21, 2022
An economic contraction can often correspond to big changes in the technology landscape. Think about the 2008 to 2009 recession, The Great Recession. Amazon’s cloud computing service, AWS, was born the year before the recession, and after the recession, IT became all about cloud computing.
Apple introduced the iPhone in 2007, and from 2008 onward, Apple stopped being a computer company and became The iPhone Company.
“The 2000 crash, and then the 2008, each of those were, I think, inflection points for tech companies,” says Rajiv Ramaswami, CEO of software maker Nutanix.
“I think this is likely to be one of them,” he says of an as-yet-undeclared economic contraction.
The U.S. has seen two consecutive quarters of declining GDP this year, which is one informal criterion for a recession. But the National Bureau of Economic Research, which holds the responsibility in the U.S. of declaring recessions, hasn’t declared one. If and when it does, it may be long after the fact, say, a year from now.
Ramaswami was not making an economic prediction. He was taking up my question, What happens to tech if we have a recession?
We were talking this past week at Nutanix’s satellite office in New York, in midtown Manhattan, as Ramaswami swung through town on customer visits. The company is headquartered in San Jose in Silicon Valley.
“It's great to be out with customers again,” he says. “We’ve been in this environment where we've had remote interactions for a long time, for two years now, and it's time for us to get back out on the road,” he explains. “I don’t think there’s a substitute for that” in virtual meetings.
What happens in the inflection point? One thing that happens is a “reckoning,” says Ramaswami, for all the startup companies. “They’re running out of money, and they’re going to have to hunker down, and some of the good ones will raise the cash to see them through and to be still relevant after we come out of this.”
I think that’s a conclusion that’s hard to deny. But more broadly, I’m wondering, what happens with the big changes, the thematic shifts that affect whole industries, as with cloud’s rise from 2008 onward?
There will be a more “nuanced” approach to cloud computing, Ramaswami tells me. “Historically, people talked about moving [their IT operations] to the cloud as a destination, to go to Azure, AWS, etc,” says Ramaswami. “What they’re really talking about is they like the attributes of what a cloud is, the automation and the services that are available.”
This is a bread-and-butter issue for Nutanix, which makes cloud-like software. Nutanix’s programs can “virtualize” a company’s computing and storage and networking so that in the company’s own data center, things run more like cloud computing, meaning, with more efficient use of shared resources, and an ability to work around computer failure.
As important, the software can be used as a form of transit, to move a company’s programs, once virtualized, to the public cloud facilities of Microsoft and the rest, and back again. In other words, it’s a kind of shuttle system back and forth, depending on what companies want at any moment in time.
His prediction is that as customers try to work in more and more disparate locations, with a mix of what’s in their data centers and what’s in public cloud facilities, they will use software such as his to perform computing in a variety of places, including at retail store locations or other parts of the “edge,” far away from traditional data centers.
“They’ll think about cloud as an operating model, rather than a destination,” he says. “My apps and data are going to be everywhere,” he says of the typical enterprise. “When I say cloud is an operating model, they’d like to operate wherever their stuff sits, apps and data, they want to be able to get that cloud experience.”
If there is a recession around the corner, Ramaswami believes the most immediate choice about cloud as an operating model will be how to do it more economically, to save on expenses. Belt-tightening is already happening.
Microsoft’s CEO, Satya Nadella, and Google’s CEO, Sundar Pichai, have both told the Street this earnings season that customers are using less of the cloud, as they try to rein in costs.
Part of the nuance Ramaswami sees coming down the road is companies pruning what they put in cloud computing environments. “It actually works to our favor,” he says of the comments by Microsoft and Google, “because everybody's looking at costs in a significant way, and especially public cloud costs,” says Ramaswami.
“There’s much more of an inclination to be careful about what you put in the cloud, and when you put it there, and what you keep on-prem,” meaning, locally, in the data center. Those choices, increasingly meaningful in a budget-constrained period, will benefit from the Nutanix shuttle system, to pick and choose.
“They've got to be much more concerned about making it [applications] portable, so that they can have the freedom of moving back and forth.”
A more profound change, down the road, long after a recession, is that companies are liable to move between different public clouds. They could run an app in Microsoft’s Azure in the morning, and move that same app to Google’s GCP in the evening. That hasn’t happened yet.
“Right now, every one of our customers is saying they want to run some stuff on-prem, and then have more than one public cloud provider” as a form of arbitrage, he says. Meaning, an ability to pick, case by case, where they’ll get the best deal. However, “We haven’t seen a lot of movement from one to the other,” he says, meaning, that morning-to-night example of moving an app from Azure to GCP or AWS or another provider.
What does it take for that to happen? “They’ll have to start building apps that are portable,” says Ramaswami, referring to his customers. “Historically, people have gone to the cloud because it’s that easy button to push,” he says, the “one-click” experience of Amazon AWS and the others. Just slap down a credit card and get going. That easy button, however, also weds a company to Amazon or Microsoft or Google’s services, he says.
The Nutanix software is only part of the puzzle. “The question is what about all the other services that the app needs,” he says, including a database to draw from; a form of data “caching”; a form of search; and a way to pass messages between applications.
“The real opportunity is for people to say, I’m going to build with open-source tooling,” he contends. Only open-source code is likely to provide the portability that is needed to free customers.
“The good news is, there are a lot of open-source options available” for all those functions. “In fact, ironically, a lot of the public cloud is just proprietary implementations of open-source,” observes Ramaswami. That’s true: Apache Spark, to take just one important example, is a crucial data management program that is broadly available for download, but then Amazon has its own version of Spark.
Ramaswami’s goal, over many years, is for Nutanix to sell such open-source tools. “If you look at our own role in that today, one of the things we are doing is to say, We provide a set of tooling on open source databases,” known as “database-as-a-service.”
“And we are in the process of making that a common platform that you can use everywhere.” Nutanix, he says, would like to do the same with all the other pieces people need from one cloud to another. “We have the same aspirations to do that with all these other things” such as messaging and search, “a little bit at a time.”
The same aspiration, he says, is taking shape at companies that are younger, such as Confluent and privately-held Databricks. They all, like Nutanix, have a vision of things spanning from one cloud to another, without a “lock-in.” It’s what I’ve referred to in the past as “trans-cloud,” software that is not beholden to one particular cloud computing service provider.
Nutanix, says Ramaswami, is among a cohort of companies with the same vested interest in helping their customers escape the lock-in.
“If I’m Confluent, it's in Confluent’s best interest to make that available not just in AWS, but also in Azure, right? And everywhere,” says Ramaswami. The strategy has already been in operation, he notes, at Red Hat, the division of IBM that sells a virtualization tool, called OpenShift. “OpenShift is available across all the clouds, that is their explicit strategy, to create a consistent platform at a level that works across multiple clouds.”
The same could be said for Snowflake, the database company that operates in whatever public cloud a customer wants.
These things may take years yet, Ramaswami realizes, to coalesce, although sometimes, economic shifts have a way of propelling such change. Could trans-cloud, a more nimble transit between many clouds, be of benefit to Nutanix?
“Yes, absolutely,” he says. “We are clearly there at the infrastructure level today,” meaning, the virtualization that shuttles workloads around. “We are trying to go up the stack with database as-a-service,” and to other applications.
“Again, the philosophy of making that available everywhere, to make the simplicity, flexibility of choice and portability available across [clouds] — If you're going to continue doing this path” of spanning different clouds, he says, “it's going to help customers, it’s going to help us.”
For the moment, in the waning months of 2022, the Street is of course fixated not on sea-change but on what might happen to revenue in a recession. Nutanix is currently in a quiet period, Ramaswami emphasizes, its fiscal first quarter having just ended last month. The earnings report is due out November 30th. Hence, Ramaswami is not making any new financial declarations, he notes, merely reiterating what he has said to the Street on the last conference call.
Ramaswami has already told analysts a good chunk of revenue is accounted for in 2023 by software “renewals,” a point he and I talked about last month.
As devil’s advocate, I press him on that. Is there a chance renewals could be hampered in a recession scenario, because they could represent too large a line item for some customers?
As long as his customers are actually using his software — “and we think most of the customers are,” he says — then, he contends, “they are typically running their enterprise on it, they are running their mission-critical workloads on it, and it's not an optional thing for them.” Not easy to not renew, in other words.
What is up for grabs, or up in the air, is new business from prospects. For those companies, there is the vague possibility that belt-tightening, as mentioned above, may bring companies to Nutanix for help. For the moment, he says simply, “It’s new business where there's a lot of uncertainty, and we’ve been prudent enough” in setting a cautious tone with the Street.
In the same breath, Ramaswami reminds me that his biggest competitor, VMware, the company that started the virtualization technology phenomenon, is being bought by Broadcom. That will very likely be a source of new customers for Nutanix, he tells me, because in a merger scenario, customers start to worry about what’s happening to their vendor.
“It's coming up in every conversation I have had with customers,” he tells me of the merger. “They're all concerned.”
It’s not just uncertainty, he says. Broadcom, as an acquisition vehicle under CEO Hock Tan, has developed a certain reputation. “Some of the customers who were customers of CA and Symantec,” two large software makers bought by Tan, “have seen what has happened,” he says. “It was not a great experience for them,” he adds, without elaborating.
The worry, he says, is that “they’re [Broadcom] going to try to lock-in” customers to a VMware experience that closes off the customers’ options, precisely the opposite of a more flexible, nuanced cloud experience. “I mean, that’s the Broadcom mindset.”
It will “take time” for Nutanix to see the benefits of fleeing VMware customers, he says. It takes patience to have conversations and develop relationships. “We say, we’re here to help you,” is the gentle sales pitch. The nice news is that a lot of those prospects “are inbound calls, where we weren’t engaging before” — potentially a whole new customer cohort, in other words.
From an investor standpoint, Nutanix is having its own inflection point irrespective of the economy. This year just ended was the first year the company achieved positive free cash flow since the company switched to a subscription business model from traditional license sales, roughly four years ago.
That is one achievement on a “path,” says Ramaswami, to maintaining “profitable growth,” made somewhat more relevant by economic uncertainty.
“It’s very clear investors are rewarding companies that are profitable,” he observes. “We have been on this path for two years, but it so happens that recession is now adding a rationale for why we should be doing this.”
“We are committed to going down this path to eventually getting to Rule of 40,” he says, using the Street jargon for when a company’s revenue growth rate and profit margin add up to at least 40.
The Street predicts Nutanix this fiscal year increasing sales by just under thirteen percent, for a total of $1.78 billion, and generating an Ebitda margin of about six percent.
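As a quick illustration of the Rule of 40 math using those Street estimates:

```python
# Rule of 40: revenue growth (%) plus profit margin (%) should total 40 or more.
# The figures below are the Street's Nutanix estimates quoted in the article.
revenue_growth_pct = 13  # "just under thirteen percent"
ebitda_margin_pct = 6    # Ebitda margin of "about six percent"

score = revenue_growth_pct + ebitda_margin_pct
verdict = "meets" if score >= 40 else "falls short of"
print(f"Score of {score} {verdict} the Rule of 40")  # prints a score of 19
```

In other words, on current estimates Nutanix scores roughly 19, which is why Ramaswami frames Rule of 40 as an eventual destination rather than the present state.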
Another thing, I observe, that happens in a recession, is M&A. Prices of some assets get so low, acquirers get more aggressive. Ramaswami cannot comment on any possible M&A, he says, but offers that he is “flattered” by “all the attention we’re getting.” He is referring to a story last month by The Wall Street Journal’s Dana Cimilluca and Cara Lombardo headlined, “Nutanix explores sale after receiving takeover interest.”
“The more attention the better, the more people that write about us, the better for us,” he says. That is true, given that for much of the public, Nutanix’s software is somewhat mysterious.
As for any deal talk, “It’s not for me to comment on these things,” he says. “It’s for us to focus on building our business, which is what we are doing.”
Nutanix shares are down just over ten percent this year, trouncing the Nasdaq Composite Index’s twenty-nine percent decline.
FTX disaster: Guy who liquidated Enron has never seen a mess this bad
Nov 18, 2022
An empire run like a teen with secrets to keep: “Mr. Bankman-Fried often communicated by using applications that were set to auto-delete after a short period of time, and encouraged employees to do the same.”
It’s no fun to pile on to Monday-morning quarterbacking of disasters, but then every once in a while, a document comes over the transom that is so delicious, it’s hard to resist.
FTX is a crypto-currency exchange that was founded in 2019 by Sam Bankman-Fried and a couple of young friends. It had been, up until a couple weeks ago, perceived as a pillar of the crypto world, if that means anything. It is now in Chapter 11 bankruptcy proceedings, having lost billions in clients’ money.
The vague story leading up to Thursday was that the company had nowhere near the liquid assets people thought it did, and so, no way to safeguard the billions in deposits that FTX’s customers had placed with the company. It appears a hedge fund inside of FTX was secretly taking funds from those depositors and using them to trade — at least, that’s been the surmise of CNBC and other sources to date.
Thursday came the filing in bankruptcy court of a thirty-page document from the person who has taken over FTX to liquidate it, John J. Ray III, who is a career restructuring expert.
Ray presided over the liquidation of the notorious energy failure Enron, among others. Given the amount of malfeasance Ray has seen in his career, it’s quite something to read what he had to say in his dossier.
“Never in my career have I seen such a complete failure of corporate controls and such a complete absence of trustworthy financial information as occurred here,” writes Ray of FTX, after a week going through what little there is of the books.
“From compromised systems integrity and faulty regulatory oversight abroad, to the concentration of control in the hands of a very small group of inexperienced, unsophisticated and potentially compromised individuals, this situation is unprecedented.”
Among the failures Ray describes,
“the absence of an accurate list of bank accounts” — it wasn’t even clear where the company’s cash resided;
Employees bought stuff on the company tab: “In the Bahamas, I understand that corporate funds of the FTX Group were used to purchase homes and other personal items for employees and advisors”;
An audit firm that sounds nuts: “Prager Metis, a firm with which I am not familiar and whose website indicates that they are the ‘first-ever CPA firm to officially open its Metaverse headquarters in the metaverse platform Decentraland’”;
An HR procedure that mixed together employee and contractor records, “with unclear records and lines of responsibility”;
Managing payments worse than a lemonade stand: “employees of the FTX Group submitted payment requests through an on-line ‘chat’ platform where a disparate group of supervisors approved disbursements by responding with personalized emojis”;
“did not keep appropriate books and records, or security controls, with respect to its digital assets” — custodian with no idea of custody;
Managed records like it was Snapchat: “One of the most pervasive failures of the FTX.com business in particular is the absence of lasting records of decision-making. Mr. Bankman-Fried often communicated by using applications that were set to auto-delete after a short period of time, and encouraged employees to do the same.”
There are multiple investigations underway of the whole business, including an SEC investigation and a criminal investigation in the Bahamas, where FTX was domiciled and where Bankman-Fried was apparently residing.
If all this is as bad as it seems, then to my mind, it supports what I wrote over the summer, which is that certain foundational promises of crypto have been broken.
Crypto, it turns out, is not decentralized as its mythology would imply; it’s in the hands of massive exchanges such as FTX and other parties that dominate activity including Binance.
And yet, its centralization has not meant protection for investors, in fact, just the opposite. Crypto is like a throwback to the Great Depression, when there was minimal oversight of banking and depositors were abused on a regular basis without recourse.
Crypto is, in a sense, the worst of both worlds: the manipulation of centralizing forces, but with all the disorganization and lack of security of the Wild West.
Applied Materials rising as quarter turns out much better than feared
Nov 17, 2022
Update:
It was a very upbeat conference call this evening between CEO Dickerson, CFO Brice Hill, and analysts.
The outperformance the company displayed in the headline results, relative to its warning in October, was a result of two things. One, the company’s hit from U.S. sanctions against China turned out to be less than expected initially, a decrease of $280 million rather than the $400 million that had been forecast.
Second, said Hill, the company’s “execution in the end of the quarter was almost flawless from a logistics perspective.” Applied, he said, “got more supply chain parts in at the end of the quarter,” which helped boost revenue by a couple hundred million dollars.
Both Dickerson and Hill emphasized that the company has a record amount of backlog, meaning orders it hasn’t yet been able to fulfill in a timely manner. While the Street has focused on a slowdown in chips, the story for Applied continues to be supply-chain issues that have held back shipments of equipment.
“We are still supply chain limited across a number of key product lines,” said Dickerson, although, he added, “we expect to continue closing supply gaps over the next few quarters.”
As for that backlog of orders, it was up sixty-two percent, to a total of nineteen billion dollars. Nineteen billion dollars is, I would note, equivalent to seventy-three percent of all of last year’s revenue. So you could think of it as Applied having almost a year’s worth of revenue “in the bag,” so to speak.
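The scale implied by those two figures works out as follows; last year’s revenue here is simply inferred from the backlog numbers quoted above:

```python
# Quick check on the backlog figures quoted above.
backlog_b = 19.0           # backlog, in $ billions
share_of_last_year = 0.73  # backlog as a fraction of last year's revenue

implied_last_year_revenue_b = backlog_b / share_of_last_year
print(f"Implied last-year revenue: ${implied_last_year_revenue_b:.0f}B")  # ~$26B
```

That implied revenue base of roughly $26 billion is what makes the “almost a year’s worth of revenue in the bag” framing hold together.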
Dickerson talked about various markets, and on balance, what he had to say was positive. Yes, there is “weakness in consumer electronics and PCs,” and that will continue to be a weak spot for the chip market into 2023, he said. On the other hand, “automotive, industrial, and power markets remain robust.”
2023 will be a “down year” for equipment sales for the whole industry, said Dickerson. And Applied may see its revenue diminished to the tune of two and a half billion dollars, he said, because of the continued sanctions on sales to China.
But, said Dickerson, “we believe that Applied's business will be more resilient, thanks to our large backlog, growing service business, and strong customer demand for our leadership products that enable key technology inflections.”
Moreover, said Dickerson, chip complexity keeps rising on the path to one trillion dollars in chip sales come 2030.
“As technology complexity is increasing, we expect equipment intensity to remain at today's levels or rise further,” he said. “This means wafer fab equipment is likely to grow faster than the overall semiconductor market.”
Previously:
Chip equipment giant Applied Materials this afternoon reported fiscal fourth quarter results and outlook comfortably ahead of consensus, and better than a warning it offered in mid-October.
It was the strongest quarterly showing since August of last year, as the company deals with the global economy hitting demand for chips and thus, chip equipment.
The company’s revenue and profit of $6.75 billion and $2.03 per share was higher than consensus of $6.44 billion and $1.75.
Most interesting, the final revenue and profit numbers this afternoon are well above a revised forecast of $6.4 billion, plus or minus $250 million, and $1.54 to $1.78, excluding some costs, that the company had offered in mid-October when it warned that new regulations on sales to China would hamper its results. It would appear things turned out much better than feared.
The revenue beat, and the forecast revenue, are both five percent higher than consensus, the best showing that I can see going back to August of 2021.
Gary Dickerson, Applied’s CEO, said the company is dealing with the global economic and geopolitical situation, and will reduce some of its spending, while nevertheless sounding a chipper tone.
Said Dickerson,
Applied Materials delivered a strong finish to our fiscal year with record performance, and we remain focused on mitigating supply chain constraints and doing everything possible to meet customer demand. Though we are slowing the rate of spending growth in the near term amid geopolitical and macroeconomic challenges, we are making the strategic investments to win the major technology inflections that will enable Applied to outgrow the semiconductor market.
Applied will hold a conference call at 4:30, Eastern, and you can catch the webcast of it on the company’s investor relations Web page.
Applied shares are up two percent in late trading. Shares of fellow chip equipment vendors Lam Research and ASML are also rising.
Applied is a member of the TL20 group of stocks to consider. With the small rise after-hours, the shares are up thirteen percent since the inauguration of the TL20.
Nvidia CEO Huang: Cloud expands the company’s reach into enterprises
Nov 17, 2022
Huang says selling his company’s full plate of hardware and software, “the stack,” in public cloud facilities such as Microsoft’s Azure, “is just so much more coherent” as a way to sell to enterprises. It’s conceivable the deal also opens up many more prospective customers, thus expanding Nvidia’s total addressable market.
Following a report this evening of better-than-expected quarterly revenue, and an in-line outlook for this quarter, Nvidia’s CEO Jensen Huang was kind enough to take a moment to talk with me by phone. I told him that Nvidia is one of the inaugural picks in the TL20 list of great companies to consider owning. “Thank you so much” was Huang’s reply.
Huang can be a person of few words in some interviews. When I asked him what is most important from tonight’s results and outlook, he replied, “We are guiding a better quarter next quarter than this.” What he was referring to was that the company says it is getting its arms around a situation of over-supply of chips for video gaming that caused the company in August to cut its outlook. “We have quickly taken care of our inventory, corrected for our inventory,” he told me.
Huang then recounted the product highlights that he’d also talked up with the Street on tonight’s call:
We have multiple products in the early ramps that are home runs. Hopper. Transformer engine. The Ada generation of GPUs — off the charts. Orin has our Drive, our autonomous vehicle platform, to make the auto business into our next multibillion-dollar business. Great stuff going on.
The “Hopper” chip is the latest Nvidia GPU being used for artificial intelligence, which is just coming to market and which racked up impressive test results for AI tasks this month. Huang sees connected vehicles with the Nvidia “Orin” chip as being the next big market for the company after video games and AI.
Huang talked a lot on the call about the cloud service providers, or “CSPs,” including Microsoft. Nvidia had already announced Wednesday morning a deal with Microsoft to offer what Nvidia refers to as its “full stack.” That is jargon for adding to the chips that Microsoft already uses from Nvidia with “tens of thousands” more GPUs, as well as software and chips dedicated to networking together computer systems.
The deal has echoes of the big win that Nvidia had earlier this year with Meta, the owner of Facebook, to buy tons of GPUs for the Research SuperCluster Meta is building for AI. The deal also brings Nvidia’s software to Azure, called “AI Enterprise.” The software acts like a bag of apps for companies, to ease their ability to put together AI into something usable.
I asked Huang what the significance is for his company. Huang can sometimes be frustratingly “on message,” and his response to me was similar to what he said on the call to the analysts.
“We, as you know, always have sold GPUs to CSPs, but CSPs have become two parts,” said Huang. “One is internal, and secondarily, public clouds.” He’s referring to the fact that Microsoft both uses Nvidia chips to develop cloud products and services, and directly rents Nvidia chips to Azure customers who want to use them.
Huang went on to say that Microsoft is going to be “a cheerleader for us” when it comes to pitching the Nvidia chips and software to enterprises, in addition to Microsoft using the chips to run their own AI offerings.
Huang expects that enterprises will increasingly use AI by renting it from public cloud services. “It’s very clear now that we are at the tipping point of every enterprise company being cloud-first,” as he put it. Basically, that’s because AI programs are so complex, it’s just too expensive and too complicated for most companies to try and do it themselves in their own computer facilities.
That development with cloud is good for Nvidia for two reasons. Deals like the Microsoft deal mean that Nvidia gets a big channel by which to sell its stuff to enterprises. And secondly, reading between the lines, I would deduce that such a deal also means Microsoft may be cooling off on some of the work on custom chips that it had been pursuing in recent years to try and be self-sufficient. Microsoft may be deciding it’s better to just keep buying from Nvidia. If so, it’s a massive win for Nvidia.
What Huang didn’t say, and he also didn’t say when asked the question by CJ Muse of Evercore on the call, is whether selling software will change the financial model of Nvidia’s business, as I had mused in September.
I pointed out to Huang that in the past decade, his company has gone from being what I had considered the scrappy challenger in the data center, trying to unseat Intel, to now being the dominant firm among all chip vendors in the data center, the company in control of the workloads that matter, AI.
How, I asked, does that change in the profile of Nvidia change the kinds of opportunities that Nvidia pursues, or the challenges the company faces?
The question was a philosophical one, but Huang can sometimes be frustratingly evasive when it comes to answering long-winded business questions. In this case, he punted and merely went back to the matter of selling the Nvidia stack in the cloud.
Selling the stack, he said, will make things easier for his customers to use his technology wherever and whenever:
Our ecosystem, the end user, the end markets, the end vertical markets, would be the same. We've always called on, we've always engaged, in part, the end markets. And now we have a coherent, if you will, an organized way of going to the end markets both through cloud and OEMs. And as a result, one architecture, Nvidia AI, runs on prem as well as in cloud, one full stack. And this way of serving customers is just so much more coherent. And by using the Nvidia stack, they could basically run everywhere. They could run on any OEM server, they can run in any cloud.
What I would take away is that, again, Nvidia expects that Microsoft and the other cloud computing firms, Amazon, Google, Oracle, are going to be a much bigger channel for Nvidia to sell indirectly to enterprises.
Is that a meaningful development for Nvidia? Yes, I think it can be. It can mean that Nvidia, a company that has been selling gear too expensive for many enterprises, now may have a way to price and bundle offerings in the public cloud that will bring its hardware and software within reach of more companies. In other words, it can expand the total addressable market for Nvidia, something all companies love to do.
Nvidia shares are up two percent in late trading. The stock is up almost two percent since I picked it in July for the TL20.
Nvidia’s forecast in-line with Street, says ‘quickly adapting’ to global economic slowdown
Nov 16, 2022
Artificial intelligence chip titan Nvidia this afternoon reported fiscal third quarter revenue that topped analysts’ expectations, but missed on the bottom line, and forecast this quarter’s revenue a tad light of consensus.
The report follows Nvidia having cut its expectations in August, after slowing video game chip sales in a weakening global economy left it with rising inventory of GPU chips.
In prepared remarks, co-founder and CEO Jensen Huang told the Street, “We are quickly adapting to the macro environment, correcting inventory levels and paving the way for new products.”
Huang made a number of upbeat remarks about the company’s latest products and markets:
The ramp of our new platforms ― Ada Lovelace RTX graphics, Hopper AI computing, BlueField and Quantum networking, Orin for autonomous vehicles and robotics, and Omniverse ― is off to a great start and forms the foundation of our next phase of growth. NVIDIA’s pioneering work in accelerated computing is more vital than ever. Limited by physics, general purpose computing has slowed to a crawl, just as AI demands more computing. Accelerated computing lets companies achieve orders-of-magnitude increases in productivity while saving money and the environment.
Revenue in the three months ended in October was $5.93 billion, within the company’s own forecast range of $5.78 billion to $6.018 billion and above the $5.78 billion analysts had been modeling. Profit of 58 cents a share, excluding some costs, was below the average 71-cent estimate.
Sales of chips for the gaming market plunged by fifty-one percent from the prior-year period and by twenty-three percent from the second quarter.
Sales for the data center, including AI, rose by a healthy thirty-one percent, though that was slower than the sixty-one percent in the prior quarter and the eighty-three percent in the quarter before that.
The forecast for this quarter’s revenue is six billion dollars, plus or minus two percent, which is just slightly below consensus for $6.074 billion.
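The guidance arithmetic is worth spelling out, since a plus-or-minus-two-percent band around $6.0 billion actually contains the consensus number; a quick sketch using the figures above:

```python
# Nvidia's guidance for this quarter: $6.0 billion, plus or minus two percent,
# versus Street consensus of $6.074 billion (figures from the report above).
midpoint = 6.0e9
low, high = midpoint * 0.98, midpoint * 1.02
consensus = 6.074e9

print(f"Guidance range: ${low / 1e9:.2f}B to ${high / 1e9:.2f}B")  # $5.88B to $6.12B
print(consensus <= high)  # True: consensus sits inside the range, above the $6.0B midpoint
```

In other words, only the midpoint of the forecast, not the whole range, is below consensus.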
Nvidia will hold a conference call with analysts starting at 5 pm, Eastern time. I’ll be interviewing Huang later this evening, so be sure to check back for that.
Nvidia shares rose two percent in late trading to $163.
Also this afternoon, networking giant Cisco Systems beat with its fiscal first-quarter revenue and profit, and raised its outlook for the full year’s revenue and profit above consensus.
Cisco shares jumped five percent in late trading.
Alteryx CFO: In a downturn, customers may need us even more
Nov 16, 2022
The modern business of selling software programs is not merely a matter of making a good pitch to prospective buyers. Because the software contracts are renewed every year, or two or three, a vendor must make sure not to lose the customers they’ve already got.
“We really do have a zealot user community,” Kevin Rubin, chief financial officer of software maker Alteryx, says of his customers.
As proof of that, during the company’s annual “analyst day” meeting with the Street in May, he recalls, a panel of Alteryx customers was asked, “What would you do if your company chose to move away from Alteryx?”
“The response of one of them,” says Rubin, “was along the lines, ‘You’d have to pry it from my cold, dead hands.’”
Rubin shared that anecdote with me last week over a meeting on Zoom following Alteryx’s third-quarter report November 1st. Maybe you think the hyperbole is overly dramatic for a software tool, but it fairly represents, says Rubin, how Alteryx’s value to customers hasn’t evaporated as the economic uncertainty has arisen.
“I don't want to leave anybody with the impression that we're immune to the macro effects, but I do believe we provide significant value to customers and we'll continue to convey that,” he tells me.
While not immune, perhaps, Alteryx is faring way better this year than a lot of other vendors.
The third quarter was the fourth quarter in a row of revenue upside. And while some firms are trimming their outlook because selling is getting harder, Alteryx raised its forecast for the year for the third time in a row.
At the heart of this delightful streak is the fact that a lot of modern software selling is the phenomenon of renewals, when customers who bought a license to the software years prior have to pay up again to keep using it.
The Street loves renewals because if you can get paid again by the same customer in a periodic fashion, it’s like a guarantee of future revenue, and, sometimes, with little or no incremental cost.
Rubin told the Street on the conference call that renewals of the company’s software were “robust” in the third quarter, without disclosing a quantitative amount. In our exchange, Rubin observes that even in a bad economy, there are things that keep customers coming back. It’s not just that they’re zealots, it’s that the software becomes enmeshed in how companies function.
Alteryx makes data analytics software programs that are meant to “democratize” use of data in an organization. Rather than be stuck with a lame Excel spreadsheet, or, conversely, needing to have a PhD in data science, a person who uses the Alteryx software is supposed to be the everywoman, or man, in an organization, a citizen analyst who can derive meaningful insights with less toil. They can use the programs to get a read on all kinds of trends in how the business is performing.
When customers deploy Alteryx software, Rubin points out, they’re often replacing heavily manual activity like the Excel spreadsheet by “automating that work on a server deep in their IT environments.”
“That becomes incredibly valuable and sticky for those organizations,” he tells me.
Renewals are the latest test of the turnaround engineered by CEO Mark Anderson since he came aboard in October of 2020. Anderson revamped the way the company sells, focusing on the biggest of the world’s companies, and creating a sales culture in which Alteryx reps do a lot more hand-holding and after-sales consulting, to get away from one-shot deals and get larger purchases.
The first test of that approach was whether it would re-accelerate Alteryx’s flagging sales, and it has. Revenue is expected to rise fifty-nine percent this year, to $830 million, after just eight percent growth last year.
The company last quarter had its biggest deal with a new customer in its twenty-five-year history, says Rubin, without disclosing the customer name or the dollar amount.
“The penetration that we currently enjoy in the customers we do have is still incredibly low,” says Rubin, given how many can potentially benefit from the tools. “So, there is a lot of opportunity to continue to grow this business within the existing installed base for many years to come.”
Having reinvigorated the sales effort, the next test is whether those companies would keep renewing even in tough economic times. That’s the test now playing out, and it appears to be going very well.
“Some of the changes that we made were how we support and manage customers, post-deployment, and in particular how we renew customers,” says Rubin. Chief revenue officer Paula Hansen, who came aboard last year, says Rubin, has been instrumental in that effort to stick close to customers, rather than to sell them something and then go on to the next prospect.
“We’ve invested heavily in customer success,” says Rubin. “That organization's sole purpose is to ensure that the most important companies in the world are getting, call it, white glove treatment around what they've purchased, how they're deploying, and then opportunities for future deployments and future ROI with Alteryx.”
The intention of that hand-holding is to make sure that the questions become, says Rubin, “How do I use more of Alteryx and less of other things,” rather than, “How do I reduce my spend on Alteryx?”
In fact, the urgency of using the software should increase, if anything, in a time of stress, says Rubin.
“Organizations know data doesn’t get less just because the business environment is less” in a recession, says Rubin. “You know, if you’re a company that’s going through hardship, you’re not asking yourself fewer questions.
“It becomes even more important that organizations have an intense focus on the data around them to make decisions.”
Alteryx’s goal since Anderson shook up the company is to court the Global 2,000 biggest firms, of which Alteryx has about half as customers now. “So there's another half that we believe to be Alteryx customers over time, and then there’s several thousand others that we believe should also be our customers.”
And every one of those new customers and existing customers can be mined to sell even more software by spreading use of the program throughout an organization. “You can’t teach a data scientist how to be an accountant,” observes Rubin, “but you can, through Alteryx, enable an accounting person to operate like a data scientist.”
“You take your unsung heroes within your operational roles and provide them with a toolset that can allow them to operate at a much higher skill level,” is how Rubin puts it. That’s the notion of the democratization of analysis mentioned earlier.
The result is that more and more subscriptions can be sold for more and more people in an enterprise to use the programs. Fifteen percent of the workforce inside any enterprise are potential Alteryx users, based on estimates the company has compiled with research firm IDC.
Of course, revamping a company’s entire way of selling and supporting customers has a cost, and Alteryx’s free cash flow has taken a hit this year, dropping from almost fifty million dollars in positive free cash flow last year to what the Street projects will be negative free cash flow of eighty million this year.
How does the investor base feel about that? I ask.
“In the investor conversations that I've had since we reported, I think there is deep appreciation for the top-line performance of the business,” says Rubin. At the same time, he is aware, “there’s a much greater focus today on profitability and cash flow than six months ago.”
Rubin has emphasized to investors, he says, “our commitment to leverage in our financial model, and driving profitability and cash flow over time.” To that end, the company is slowing hiring now, and is making “significant” real estate changes, basically, reducing its footprint as more people work from home. Those two things, hiring and real estate, are Alteryx’s biggest costs, he notes.
“I think over time you're going to continue to see the business take steps towards the long-term model that we published most recently in May” at the analyst meeting, “and demonstrate to shareholders the growing earnings power of this business.”
That financial model that Rubin put forward in May implies Alteryx will at some point be generating operating profit margin of as much as thirty percent, up from two percent last quarter, and a free cash flow margin that’s positive twenty percent to twenty-five percent — both significant improvements if they come about.
The sharply better results this past year suggest much of the heavy lifting in CEO Anderson’s turnaround push has now been completed. However, “If you ask Mark, he would not suggest at all that we’re done,” Rubin tells me.
“I don't think any of us look at this as if, you know, we're turning a corner per se, and taking a big sigh of relief.
“We have a commitment to delivering a lot of innovation and really scaling this business in a meaningful way,” he adds. “And we’re in the early innings of doing that.”
Alteryx’s shares are down twenty percent this year, and basically flat with where the stock was immediately following the earnings report.
The TL podcast for Nov 13th: Picking on Tesla, Upstart baffles, and what happened to FAANG during The Great Recession
Nov 13, 2022
Coherent (COHR) and DigitalOcean (DOCN), two TL20 names, both had well-received earnings reports. Upstart Holdings (UPST) had a baffling earnings call. The obsession with Elon Musk is a bit much. And a look back at what happened to the mega-caps of tech during The Great Recession.
TL20: Not dumping Tesla yet
Nov 11, 2022
You can say all kinds of things about Tesla, like, for example, the fact that its forthcoming Cybertruck is the ugliest vehicle ever designed. But, like a lot of other assertions, that has little bearing on the shares of Tesla as an investment.
The urge to dump Tesla stock seems to have reached a boil in recent days.
Take analyst Dan Ives of Wedbush Securities. A week ago, Ives defended Tesla when talking to Barron’s Al Root, telling Root that challenges to Tesla were “a near-term storm that will pass.”
Thursday morning, however, in a note to clients, Ives threw in the towel, removing the stock from his “Best Ideas” list. He has, he indicated, finally lost patience with Musk “crushing Tesla stock.”
The issue is Musk’s stock sales in support of his Twitter purchase, which he consummated on October 28th. The sales are not helping Tesla’s stock price at a time when Tesla’s delivery of vehicles is challenged.
Writes Ives,
This is a very nervous few months ahead for Tesla investors as they remain the ones that have been punched again and again by the Musk Twitter antics and the stock now is deep in the investor penalty box until deliveries hit in early January and we get a better sense of the 2023 delivery/production trajectory.
Perhaps waiting till 2023 is good trading advice. I, however, like to look at stocks in terms of fundamental, lasting value.
Tesla stock is in the inaugural batch of TL20 stocks to consider, and given the market cap weighting, Tesla is the biggest drag because it is the biggest company. Yet I see no reason to strip it out.
If you were concerned about Musk’s selling of Tesla stock, it’s spilt milk at this point. As mentioned by Bloomberg’s Kit Rees and colleagues on Wednesday, Musk has sold thirty-six billion dollars’ worth of Tesla this year. As Deutsche Bank’s Emmanuel Rosner writes this morning, after taxes, Musk pockets about twenty-five billion dollars, “which just about covers the required capital to complete the deal,” the bulk of the remaining twenty billion being made up by syndicate debt and by investors such as Saudi prince Alwaleed Bin Talal.
More important, I have not seen a fundamental change in Tesla’s profile as a highly successful car maker whose stock was and is reasonably valued.
There are worries about Tesla losing share, especially in China, as Toni Sacconaghi of Bernstein recently emphasized. That’s a valid concern, but competition is never a reason to dump a stock.
The field of young contenders, moreover, is a mess. Rivian Automotive, Lucid Group, and Faraday Future continue to miss expectations as they struggle to get to volume production.
There are practical hurdles to Tesla’s growth that really are meaningful, in particular the need to build out many thousands more charging stations in places all over the world. That’s a serious issue, but so far, it is not a deal-breaker for hundreds of thousands of owners taking delivery of Teslas each quarter.
There are things to quibble with about the product, some large and some small. For example, the forthcoming “Cybertruck,” Tesla’s first pickup, is, in my humble opinion, one of the ugliest vehicles ever conceived. I wouldn’t buy it with your money. But I expect it will find its place in the marketplace, so I have no expectation it will be a drag on sales.
Much more important are a series of investigations into accidents linked to Tesla’s self-driving software, including a Justice Department criminal investigation “over claims that the company’s electric vehicles can drive themselves,” according to Reuters. It’s early to gauge the potential effects of the investigation, though it is conceivable charges could some day cast a long shadow over Tesla, and that gives one pause.
Beyond the real stuff, many things about Musk as a person get people worked up. He has a big mouth on Twitter, though so do a lot of other people. And to many Twitterati, he is destroying a great service. I for one don’t read Musk’s tweets, and I’ve never loved Twitter; I would just as soon see it disappear. To me, these kinds of things are moot in terms of Tesla stock assessment.
Frustration with Musk is practically an industry, but Tesla has succeeded over many years in spite of his antics. Again and again, promises that Musk made to Tesla shareholders were broken, such as the timing of the introduction of new models. Nevertheless, the company got stronger in the areas that matter most. New products came out, production increased, and Tesla became profitable in the face of enormous skepticism.
At this particular moment in time, Tesla is a powerful company with a leading position in an important industry whose shares have been substantially discounted. That is the core argument for the stock and most of the rest is just noise.
DigitalOcean CEO: It’s hard talking strategy when the Street wants to talk recession
Nov 11, 2022
I sat down with Yancey Spruill, chief executive of DigitalOcean, Thursday afternoon, to review what he and CFO Bill Sorenson communicated to analysts on Monday.
My main question in these types of interviews, where I have an executive’s full attention for half an hour, is, Did you communicate what you wanted to communicate?
Yes and no, it turns out. Spruill wants to talk about the company’s ongoing efforts to build a stronger business. It turns out it’s hard talking strategy when everyone wants to talk about recession.
“We made some major strategic actions” last quarter, said Spruill. “The Street is trying to look through that and say, well, absent that, that something’s different in the underlying business.”
To Spruill, the fundamental mission is the same as it has been: revenue growth above thirty percent per annum, and a target to get to a free cash flow margin of twenty percent of revenue or better by 2024 from the mid-teens now.
“Despite war, inflation, currency effects, and weakening economies, we have been able to execute on those objectives, to grow rapidly and to drive free cash flow,” he says.
DigitalOcean, you may recall, is a competitor to Amazon AWS and the other cloud providers. It is focused on being a more economical version of cloud computing. And it is specifically targeting small and medium-sized businesses, who make up the bulk of the company’s seven hundred thousand or so customers.
As I wrote on Monday, Spruill and Sorenson both spent a lot of time reassuring concerned analysts that small and medium-sized businesses are holding up well.
“There’s a lot of skepticism about the durability and resiliency in this type of environment,” Spruill says of the analysts’ attitude. His point is that his customers are no more challenged than, say, Microsoft’s.
“We have looked at historical patterns in other recessions,” Spruill tells me. “What we see is that small and medium businesses perform similar to enterprises.”
As a cohort, he says, they “tend to be like cable and utilities: they tend to be pretty stable; that’s the history of ours.”
If anything, small and medium-sized businesses are ill-understood.
“Most businesses under five hundred employees are not public, they are not well-covered” by the Street, observes Spruill. “I think there is a lack of awareness of how substantial that SMB economy is.” Spruill is fond of pointing out small businesses make up half of the global economy.
His customers, he says, are not going away, but they are grateful that DigitalOcean lets them dial up or dial down their spending every month — known in cloud computing circles as “consumption-based pricing.”
“I talk to as many customers as I can,” he tells me. “They said, ‘Thank you for having this consumption pricing,’ because they are looking at every dollar,” in order to economize.
That’s all well and good, I point out, but isn’t DigitalOcean holding the bag, then, for small businesses? Maybe he never loses a customer to a competitor, but what if his customers cut their spending with his company by ninety percent? That’s not great for DigitalOcean.
“You’re right, but we are not levered to a particular country or industry vertical,” says Spruill. The company’s seven hundred thousand customers are all over the world. “We take our lumps, and we have other customers that are growing.”
By taking one’s lumps, Spruill argues, DigitalOcean is less likely to be surprised down the road. “The thirty-day [payment model], when consumption goes down, you feel it immediately.” That means, he says, “we normalize immediately, whereas, if you have one or three-year contracts” like traditional software vendors, “you get lulled” into a sense of security, and then “it hits you like a wall.”
“I think you’re seeing that in a lot of enterprise contractual subscription businesses,” he says, “that the renewal market is very challenging right now.”
Among the strategic actions his company is taking is M&A. Last quarter, the company bought privately held startup Cloudways. That company is enabling DigitalOcean to add what’s called a “managed service,” which means that it runs cloud computing with much more hand-holding for the customer, more attention.
I asked Spruill how managed services changes the nature of his profit and loss over time. The benefit is both additional revenue but also, perhaps more important, greater free cash flow and greater return on investment.
Cloudways’s gross profit margin “might be slightly below ours, but not much,” he says. More important, Cloudways is less capital-intensive because the company has been reselling other cloud providers’ infrastructure. That’s the opposite of DigitalOcean, which has been investing in property and plant to build its own computing infrastructure.
Now, says Spruill, DigitalOcean can put Cloudways on some of DigitalOcean's own infrastructure, and Cloudways can stop writing checks to providers.
“We can provide the cloud infrastructure instead of a third party, and their costs drop pretty significantly, and their gross margin goes up, becomes closer to ours,” he says. At the same time, the combined entity gets better capital utilization, he says. “We would amortize across the managed service, at a higher price point, and so therefore it’s significantly accretive to free cash flow over time.”
The other strategic move Spruill talked about is selling more data storage along with the cloud computing it sells. Again, to play the devil’s advocate, I asked Spruill whether that means being drawn into a commodity battle with Amazon and Google and Microsoft, companies that can buy storage gear at scale, and that can even afford to lose money on storage if they have to.
“I wouldn’t say anything we do is commodity,” says Spruill. Although DigitalOcean is, on average, half the price of Amazon AWS for comparable offerings, he says, “it’s not really about the price, it’s about adding the functionality that allows them to run whatever type of digital business that needs certain performance characteristics, on our platform.”
Customers, he says, end up using multiple cloud providers, and “typically that is for storage,” he says. “We are talking about closing the gap on those needs” so customers can give more of every dollar to DigitalOcean.
Again, if he can just get the Street not to obsess over recession, Spruill might be able to hammer home to them that strategic things like storage and managed services are a part of an ongoing effort to make DigitalOcean “a free cash flow machine,” as he likes to put it.
“Our view is, any strategic actions you take over the life of the company are additive,” he says. “No one hands you thirty percent [revenue] growth, you have to do it every year.”
Three years ago, DigitalOcean was bleeding cash to the tune of negative twenty-five percent of revenue. Now it’s generating positive free cash flow in the mid-teens. “To flip like that by four thousand basis points inside of three years is not typical, certainly not in the technology industry and software.”
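Spruill’s “four thousand basis points” checks out; a quick sketch, taking “mid-teens” as fifteen percent purely for illustration:

```python
# DigitalOcean's free-cash-flow margin swing, per Spruill's figures above.
# "Mid-teens" is taken as fifteen percent for illustration.
margin_then = -0.25   # negative twenty-five percent of revenue three years ago
margin_now = 0.15     # roughly mid-teens now

# One percentage point equals one hundred basis points.
swing_bps = round((margin_now - margin_then) * 10_000)
print(swing_bps)  # 4000
```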
“I’m really proud of what we’ve been able to do in terms of getting better asset utilization, and serving customers.”
When will the stock reflect all that? DigitalOcean is one of my inaugural picks for the TL20 list of stocks to consider. The shares are down sixteen percent since that inauguration. And the multiple has been cut by more than half in the past twelve months, to less than five times, measured as enterprise value divided by projected next-twelve-months’ sales.
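As a reminder of how that multiple is built, here is a sketch with purely hypothetical inputs, not DigitalOcean’s actual figures:

```python
# How an enterprise-value-to-forward-sales multiple is computed. These inputs
# are hypothetical, chosen only to illustrate a sub-five-times multiple.
enterprise_value = 4.5e9   # market cap plus net debt (hypothetical)
ntm_sales = 1.0e9          # projected next-twelve-months revenue (hypothetical)

multiple = enterprise_value / ntm_sales
print(multiple)  # 4.5
```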
If the company continues to deliver on the two goals, revenue growth above thirty percent and expanding free cash flow, on the way to revenue of a billion dollars in 2024, “The current stock price is going to be in the rear-view mirror,” says Spruill.
“A lot of the value we’ve been creating here, including substantial free cash flow and huge returns on capital as we scale this, are attractive in good markets and bad,” he says.
“The numbers bear it out, and I’m confident, over time, the market will value that appropriately.”
“We’re rocking and rolling,” adds Spruill. “I have passion about this, these markets can be great; we keep investing and keep the faith.”
Dynatrace CEO: We’re keeping companies away from the cloud precipice
Nov 10, 2022
Every software maker is saying this earnings season that it has gotten harder to sell software. Deals are now requiring extra scrutiny, extra signatures, and customers are putting off some purchases for a later date if not canceling them outright.
For many a software maker, that means cutting forecasts and crimping operating expenses. But for most, it is also a chance to reiterate to customers, and the press, why the software is important, perhaps even essential.
Software maker Dynatrace is one of those vendors that has had to trim its outlook. When the company reported third-quarter results last Wednesday, it cut its forecast for its “annualized recurring revenue,” one of those non-GAAP metrics analysts focus on. ARR, as it’s known, is the total value of contracts signed stretching twelve months into the future.
Despite that cut, when I sat down last week on a Zoom call with Rick McConnell, Dynatrace’s CEO, his emphasis was less on what he can’t control — the economy — and more about why, as he sees it, the software he sells is more valuable than ever, even, as he has told me in prior interviews, indispensable.
“The weight and the complexity of data continues to increase because of cloud,” says McConnell. “It’s too much to try and manage using just humans sitting at a dashboard.”
The Dynatrace software, lumped into categories such as “DevSecOps” and “observability,” is a tool IT can use to automate some of the constant surveillance of its applications, looking for problems, looking for things that break, a process that, without some form of automation, is really whack-a-mole, says McConnell.
“Companies are getting hordes of customer calls saying, Oh, my god, your software is down, I can’t make a bank transfer, I can’t buy your product, whatever it might be,” he says, describing what we all have experienced with regularity.
“My hypothesis is that the problem is getting not modestly worse, it’s getting much worse,” McConnell tells me, “because I can’t find the labor, it’s harder to find the problem in the software, I have too much additional software, I have too much infrastructure — it is impossible to manage that in the way they have done in the past.”
Companies’ IT departments traditionally try and triage the mess with primitive bespoke tools. Often, they can’t even see what has gone wrong.
“It’s going to get to a precipice where they can’t operate effectively,” and competitors who can operate will pass them by, predicts McConnell, a kind of Darwinian cloud epoch. With labor tight, finding the talent to keep up with cloud operations is daunting.
“We’re running at full employment,” observes McConnell. “There’s huge wage pressure, it’s hard to recruit people into positions to manage an escalating, almost exponentially escalating, array of software in a manual way.”
“At some point, that isn’t feasible anymore.”
The idea that companies are in crisis when they use cloud, and heading for potential disaster, and that good software tools become essential, is an intriguing thesis. And yet, it has not spared the stock of Dynatrace nor those of its competitors this year.
Companies are getting hordes of customer calls saying, Oh, my god, your software is down … The problem is getting not modestly worse, it’s getting much worse … It’s going to get to a precipice where they can’t operate effectively.
Dynatrace is down forty-three percent this year, including a six percent drop when it cut its outlook last week. Some are better, some are worse, including Splunk, down thirty-eight percent; Sumo Logic, down fifty-one percent; and Datadog, down sixty-two percent, to name just a few obvious contenders. Investors at the moment don’t seem to care how damned important the application is.
Even if investors don’t fully grasp it, McConnell is fired up about what his customers say the software has done for them.
McConnell, who was president at Internet bandwidth firm Akamai for over a decade before coming to Dynatrace last year, says he loves to meet with customers. The past few months, Dynatrace held a series of customer events in Vegas, London, São Paulo and Singapore. He met with ten customers a day, over a thousand customers in all. They all have a similar problem of operations being strained.
“I asked the head of a utility company, What do we do for you? He said, You keep the lights on in our country, you let us keep the grid going.” That is a testament to the pervasiveness of software, he says.
“We think of financial companies and services companies as the primary spenders on technology,” observes McConnell, “but these days it’s supermarkets, theater companies, utilities — the world is running on software.”
Even in proof of concept bids for business, before a sale, the software has demonstrated its critical utility, McConnell says. A prospective customer, a supermarket chain, was testing the Dynatrace software in real time on live activity at the stores in conjunction with a customer loyalty program.
“It doesn’t sound as critical, but, by God, if I put in my phone number at the register and it doesn’t work, and I don’t get my three dollars and fifty-seven cents, I am frustrated.”
The chain did, in fact, see the system go down, at one point, and “They fixed it as a result of the precision of the answer that we provided as to what the problem was, in fifteen minutes.” The company “immediately” became a customer, he says. “They said, Had we not had Dynatrace in there, that would have taken hours or days.”
If the utility shines through for customers, what is going to keep the investor around in this time of very sour software attitudes? Growth and profitability, presumably, even with last week’s diminished outlook.
“One thing we know is that downward trend cycles recover, and we will see that recovery,” says McConnell of the current economic malaise. Once the recovery comes, “then the question will be, What position are you in to take advantage of it?”
Even with the challenges of the moment, Dynatrace, which is expected to turn in $1.1 billion in revenue this fiscal year ending in March, has been able to keep close to the “Rule of 60,” the Street shorthand for a financial profile that combines revenue growth and profit margin.
Dynatrace had twenty-six percent revenue growth last quarter, and that would have been thirty percent if not for the deleterious effect of the rising U.S. dollar, which depressed reported revenue. The company’s non-GAAP operating profit margin in the quarter was twenty-six percent, which puts the company in a different class from a lot of other software makers that are not even profitable.
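The “rule” arithmetic is just the sum of the two percentages; a quick sketch using the figures quoted above, with the FX-adjusted thirty percent growth as the alternative input:

```python
# "Rule of 60"-style score: revenue growth percentage plus profit margin
# percentage. The inputs are the figures quoted in the article.

def rule_score(growth_pct: float, margin_pct: float) -> float:
    """Growth plus margin, both expressed in percentage points."""
    return growth_pct + margin_pct

as_reported = rule_score(26, 26)   # 26% growth, 26% non-GAAP operating margin
fx_adjusted = rule_score(30, 26)   # growth excluding the strong dollar's drag

print(as_reported, fx_adjusted)    # prints 52 56
```

By either measure the company lands within hailing distance of the sixty bar, which is the point McConnell is making about its financial profile.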
Dynatrace doesn’t need to tighten its belt, says McConnell. “We had said a few quarters ago we would accelerate our investments,” he says, meaning, spending. “And we’re largely done with that now.”
“The only thing that it changes is our trajectory of hiring,” he says. “We can just moderate that modestly” because “we believe we have the sales capacity we need at modestly lower sales headcount growth rates to deliver the performance we are expecting in the company.
“A lot of that is due to having been really successful in building out that team over the first half of this year.”
As far as sales growth, McConnell has been adding seasoned executives to the team at Dynatrace, people he says know how to take the company from a billion dollars annually to several billion. The latest addition is a new CFO, Jim Benson, who also hails from Akamai, where he was instrumental, along with McConnell, in building the company from a billion dollars in annual revenue to over three billion.
“I can’t wait to work with him again,” says McConnell. “He’s our rockstar, just delighted to have him at Dynatrace.”
What no one can control is what valuations the Street will currently assign, which are way lower across the board for all software stocks. As a multiple of enterprise value divided by the next twelve months’ projected sales, Dynatrace trades for just under eight times where a year ago it was fetching twenty-one.
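The multiple itself is mechanical to compute; here is a sketch with purely hypothetical inputs, not Dynatrace’s actual balance sheet figures:

```python
# How an EV-to-forward-sales multiple is computed: enterprise value
# (market cap plus debt minus cash) divided by projected next-twelve-
# months sales. The numbers below are hypothetical placeholders.

def ev_to_sales(market_cap: float, debt: float, cash: float,
                ntm_sales: float) -> float:
    """Enterprise value over projected next-twelve-months sales."""
    return (market_cap + debt - cash) / ntm_sales

# Illustrative only: $9.5B market cap, no debt, $0.5B cash, $1.2B NTM sales
multiple = ev_to_sales(9.5e9, 0.0, 0.5e9, 1.2e9)
print(round(multiple, 1))  # prints 7.5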
“Is that the right valuation for a promising young tech company that’s growing fast and is already profitable?” I ask McConnell.
“I can’t give predictions on multiples, but I can say that I really believe — firmly — in the long-term optimism and capabilities of our business; that’s why I’m delighted with our ongoing execution even amidst economic uncertainty.”
The investor feedback he hears these days is, “Where is the bottom of the market, where we can see a re-acceleration in overall multiples and stock prices?”
Maybe that takes a couple more or a few more quarters, he says, but, “One thing we know is that downward trend cycles recover, and we will see that recovery,” says McConnell. “It may take one or two more interest rate changes, but downward trend cycles always recover.”
Once the recovery comes, says McConnell, “then the question will be what position are you in to take advantage of it?” He intends for Dynatrace to be well-prepared.
Upstart plummets: It’s not a bug, it’s a feature?
Nov 09, 2022
A year ago, Upstart Holdings, which develops artificial intelligence to approve personal loans, was on such a roll that its CEO and co-founder, David Girouard, proudly compared his company to a great athlete:
Since Upstart's IPO a year ago, we've more than tripled our revenue, tripled our profits, tripled the number of banks and credit unions on our platform, and tripled the number of auto dealerships we serve. With that many threes, Upstart is becoming the Steph Curry of the FinTech industry.
I don’t know enough about sports to know just who would be the anti-Stephen Curry, but that’s rather what Upstart looks like these days.
Upstart Tuesday reported its second quarter in a row in which revenue and profit fell short of expectations, after a prior six-quarter streak of upside surprises. For the third quarter in a row, its forecast was also less than expected.
Shares plunged by twenty-four percent in late trading this evening, the third quarterly sell-off on disappointment. The stock, at an after-hours price of $14.47, is now down ninety percent this year, and down fifty-one percent from its closing price on its first day of trading following its initial public offering in December of 2020.
Upstart’s premise, expressed in its IPO pitch, was to dramatically expand the writing of loans to many more Americans by producing a more realistic, less-biased risk model of borrowers. As the company said at the time, too many people were being denied funds. “Four in five Americans have never defaulted on a loan, yet less than half have a credit score that would qualify them for the low rates that banks offer,” the company observed. The mission was to get many more people a loan by using AI.
What has happened this year, however, is that a rising rate environment has torpedoed the writing of loans. In the accompanying chart you can see the company’s two key metrics, reported every quarter: how many loans were issued by the company’s lending partners, and what percentage of loan inquiries actually ended up turning into a loan, the conversion rate. Those two metrics were rising from 2020 through last year, and have since been in decline.
Loan volume has plunged as rates rise and credit sources become more reticent, and the conversion rate from inquiries to actual loans has declined as well.
Here’s the table as well:
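For clarity on the second metric, the conversion rate is simply loans closed divided by rate inquiries; a sketch with made-up numbers, not Upstart’s reported figures:

```python
# Conversion rate: what fraction of rate inquiries became actual loans.
# The inputs below are hypothetical, chosen only to illustrate the math.

def conversion_rate(loans_closed: int, inquiries: int) -> float:
    """Fraction of rate inquiries that turned into funded loans."""
    return loans_closed / inquiries

# Illustrative only: 180,000 loans written against 750,000 inquiries
rate = conversion_rate(180_000, 750_000)
print(f"{rate:.1%}")  # prints 24.0%
```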
Things started to come apart with Upstart’s business in May as the revenue upside slowed. I wrote back then that the worst was probably yet to come, and things have indeed gotten worse.
The September quarter results that Upstart offered Tuesday night included the first time since the IPO that revenue declined instead of rising, dropping thirty-one percent from the year-earlier period.
On this evening’s conference call, Girouard told analysts, “Our results in Q3 were certainly not what we wanted them to be, but I also believe they reflect the Upstart team making the right decisions in a very challenging economic environment for the long term success of the company.”
The sharp decline in loan volume, said Girouard, is a combo of the company approving forty percent fewer borrowers who apply, and interest rates being eight percentage points higher than a year ago. Higher rates make loans less attractive to borrowers, on average.
Consumers are strapped, and their financial profile is becoming worse and worse. The consumer personal savings rate in September was down to 3.3%, according to Upstart’s CFO, Sanjay Datta, which, he noted, was “a level not seen since the great financial crisis” of 2008 to 2009. At the same time, credit card balances, said Datta, have swelled to record highs as people pay for goods at inflated prices without the benefit of higher incomes.
The result, said Datta, is that defaults for consumers are surging to a level twice what it was prior to COVID-19’s arrival, and twenty percent higher than it was just three months earlier.
The weak outlook in Upstart’s own forecast for revenue is a reflection, said Datta, that the company expects things will get worse for the loan business in the near future. In fact, Girouard tonight used my phrase from May: “We assume the worst is in front of us.”
Now, all of this macroeconomic woe is, of course, what everyone is observing, and there’s nothing really surprising in any of it. What is interesting is what it says about Upstart as a business, and what the company itself thinks it says about its business.
I have been skeptical about Upstart since before its IPO. My skepticism was rooted in the fact the company tossed around the term “AI” without really disclosing what it was doing. Rather than amazing technology, I said, what seemed most important in the company’s strategy was that the company had put together a system of loan originators and credit financiers who both were chasing yield at a time of super-low yields.
As I wrote back then, “there’s precious little of substance about AI in this pitch, and a lot that’s very interesting about credit and about the lending business.”
Tonight, on the call with analysts, Girouard not only stuck to the script that his company’s technology is revolutionizing lending, he leaned in to that premise — hard.
In one of the most amazing instances of putting a brave face on things, Girouard told analysts that “contraction in lending volume in a time of rising rates and elevated consumer risk is a feature of our platform, not a bug,” emphasis my own.
The phrase Girouard used is an old joke in computer circles about how a programmer will dismiss the complaints of computer users by arguing that what seems broken to the user is actually brilliant but misunderstood design.
In Girouard’s version of the old saw, his technology is getting better and better during this downturn even if it looks like the business is struggling.
Which leads one to ask, in what way would plummeting revenue and deep losses (free cash flow was negative four hundred million dollars last quarter versus positive one hundred eighty million a year earlier) be deemed a “feature” rather than a “bug”?
Somehow, and it’s not immediately clear how, the AI that has never really been explained by Upstart is getting better during this bleak period for the business.
“Our AI models have never been more accurate relative to a traditional FICO-based model,” said Girouard. The word “model” here is data science parlance for when you construct an expectation of how things function, like a guess about the nature of things.
That guess, said Girouard, is getting richer and richer as things go south. “Our pace of model development has increased significantly,” he said. “To be more specific, the increase in Upstart's model accuracy in the last four months is as much as we saw in the prior two years.”
Now, all of this sounds like hyperbole similar to the IPO prospectus, in the sense that there is nothing specific offered as to what “accuracy” means.
The company has developed, said Girouard, an index, called the “Upstart Macro Index,” or “UMI.” The UMI, said Girouard, is “a monthly indication of the state of the economy, specifically with regard to consumer financial health and credit performance.” He and CFO Datta came back to the UMI frequently during the evening’s call, referring to it as a way to plot what’s going to happen in the economy going forward. The UMI, in other words, is a kind of weather-forecast version of AI for finance, a tool that would predict incoming storms.
And yet, when asked by somewhat nervous-sounding analysts to, in fact, predict where things are going for lending, Girouard and Datta sounded about as sage as the average weather forecaster, which is to say, not very.
When, for example, analyst Arvind Ramnani with Piper Sandler, asked Girouard and Datta, “What are some of the downside scenarios, like, I mean, if macro gets a lot worse, would you expect, like, kind-of, further deterioration in your business just given, sort-of, the strong exposure you'll have to the macro?” — a perfectly reasonable question — the answer from Girouard was oddly not very prophetic.
“Look, any business looking to the future of the economy, there are downside scenarios,” he replied. Well, I don’t think anyone needs AI to tell them that!
A little later, analyst David Chiaverini of Wedbush asked Datta about what the UMI can predict, citing a slide in the company’s investor deck that showed defaults expected to keep rising. I’ve copied it here:
Chiaverini’s question for Datta was, If the UMI index is showing seventy percent more defaults than normal right now, which it is, should that be taken as a prediction of future defaults?
“Should we expect this line on page 11 to go up towards 70%, just could you talk through that a little bit?” asked Chiaverini.
The response from Datta, to me, suggested that the UMI and all the AI stuff is not actually predicting anything at all. In fact, what is going on is that Upstart is drawing a continuation of points in space and assuming for the sake of argument that things get worse and worse for some unspecified period of time:
So today we're pricing loans at a 2.0 sort of equivalent macro index. So to put another way, if that macro index stays at 1.7 and we're pricing new loans at a 2.0, they should in fact over perform that they should come under losses by, you know, to the tune of, you know, at 17% or 17%, 20%. So because we rapidly adjust the model to recalibrate to where the – sort of UMI is trending — we are, sort-of, able to, in a sense, price these trends into the loans.
In other words, there is nothing in the AI model that is predicting the future. Instead, the company is raising what it demands of borrower profiles to fit an assumption that rates of default keep going up.
I hope that even those with no more than a little data science background can see that simply drawing a line that continues present trend is not “predicting” anything in any meaningful way. It’s really just the old practice of fitting equations to data. You have a bunch of data points, you try to discern what line they are telling you to draw.
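To make that concrete, here is the kind of exercise being described: a least-squares line fit to a few points, then extended forward. The data are invented; the point is that the “forecast” is nothing but the same line continued:

```python
# Fit a straight line to a few observed points (ordinary least squares),
# then extend it into the future. The readings are invented, UMI-style
# index values; extrapolating the fitted line assumes the trend simply
# continues, which is curve-fitting, not prediction.

def fit_line(xs, ys):
    """Return (slope, intercept) of the least-squares line through the points."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

months = [0, 1, 2, 3]
index_readings = [1.0, 1.2, 1.5, 1.7]      # hypothetical index values

slope, intercept = fit_line(months, index_readings)
projected = slope * 5 + intercept          # the same line, continued to month 5
print(round(projected, 2))                 # roughly 2.19
```

Nothing in this exercise knows anything about the future; it only restates the assumption that the recent trend persists, which is exactly the objection raised above.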
In fact, what is actually going on with the amazing AI models is that the collapse of lending and borrowing is now being used by Upstart as a test tube in which to see what happens in a negative scenario.
“While we dislike the weakening economy as much as you do,” said Girouard, “the increasing default rates that accompany this weakness serve to train our AI models faster while other platforms continue to retreat to serving super-prime consumer.”
Translation: We can’t actually predict anything, but as things collapse, we’re updating our models with the expectation that we’ll be smarter at the end of it all.
Now, in a sense, that’s true: you can’t statistically model things without the negative scenario. But that also means that it remains to be seen whether the AI here, or whatever it is, is going to at some point demonstrate an amazing ability to predict things that it cannot currently predict. For the time being, the model is just ratcheting back on lending as consumers get riskier.
Which brings us to what is perhaps the real “feature” of Upstart’s business: cash, lots of it.
Upstart had only a modest IPO in 2020, with proceeds of less than two hundred million dollars. But the balance sheet has since amassed a healthy cash balance of over eight hundred million dollars.
Said Girouard, “As you know, we've got about $800 million in total cash on the balance sheet, so, that can take us for quite some time.” Indeed, the burn rate, the amount by which expenses exceed revenue, was just fifty-five million dollars last quarter, a mere pittance compared to that cash pile.
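If one takes those two figures at face value, the implied runway is a simple division; a sketch assuming the burn rate stays roughly constant:

```python
# Implied cash runway: cash on hand divided by the quarterly burn rate.
# Uses the figures quoted on the call, roughly $800 million of cash and
# $55 million of quarterly burn, and assumes the burn stays constant.

def runway_quarters(cash: float, quarterly_burn: float) -> float:
    """Quarters of operation the cash pile could fund at the given burn."""
    return cash / quarterly_burn

quarters = runway_quarters(800e6, 55e6)
print(round(quarters, 1))   # prints 14.5, i.e. more than three years
```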
What is a feature, then, is the ability to keep losing money for a while, and not have to go back to capital markets, which Girouard assured analysts tonight the company does not need to do anytime soon.
Whether this is an amazing AI business is, to my mind, an item still unproven. But it appears to have the money to be a business that doesn’t default anytime soon.
The one thing I still wonder about after tonight’s call is the enormous pile of loans on the company’s balance sheet, which I’ve laid out in another chart, the actual numbers being in the table above:
The amount of Upstart’s loans held on its books has swelled this year to $700 million.
Upstart’s business is to be a technology provider and let others do the lending. However, in the past year, the dollar value of loans on the balance sheet has swelled from two hundred and fifty million dollars’ worth, at fair value, to seven hundred million dollars’ worth. One wonders, where does that go? And if default rates were to keep rising, according to the UMI index, does it become a problem for the company to have so much paper on its books?
Others are wondering that as well. Ramsey el-Assal with Barclays asked Datta, “I’m just curious, in terms of going forward, what are your plans there? Do you intend to stabilize that number here?”
Good question. Datta’s response, again, was not a prediction of the future, but, more or less, keeping all options open.
“I don't think we've necessarily guided a specific guideline or a number with respect to a balance sheet,” he said. “You know, whether we draw it up or draw it down over the next quarter or so, will continue to be an operating decision.”
That doesn’t sound to me like AI, or any kind of model, it just sounds like, “We’ll see.”
DigitalOcean says small businesses are ‘resilient’
Nov 08, 2022
The initial read on small businesses in tonight’s report from DigitalOcean is encouraging if not decisive.
DigitalOcean, you may recall, is a competitor to Amazon AWS and the other cloud providers. It is focused on being a more economical version of cloud computing. And it is specifically targeting small and medium-sized businesses, who make up the bulk of the company’s six hundred thousand or so customers.
The report Monday evening of third-quarter results was better than expected for revenue and profit, which was an improvement from the last report, in August, but the forecast repeated the pattern of the prior two quarters with revenue missing expectations.
CEO Yancey Spruill on tonight’s conference call with analysts said the company continues to see the impact on its customer base from a combination of factors, including “a global economic slowdown, high inflation, US dollar strength, the Russia-Ukraine war and the decline in blockchain.”
All those things are leading to slower growth than expected among his customers. DigitalOcean’s service is able to be used on a “consumption” basis, which means customers can dial up or down how much of the service they use within a quarter, and thereby pay more or less. Right now, some are paying less than traditionally was the case.
But Spruill was also vigorous in his defense of small businesses.
“The SMB [small and medium-sized business] economy is roughly fifty percent of global GDP,” he said, “it's not going anywhere, and although it is not immune from the broader trends impacting our economy, smaller businesses are demonstrating how nimble and resilient they are during this period.”
Spruill sought to dispel misconceptions:
There is a perception that SMB is subject to a more significant impact than enterprise from weak macroeconomic conditions. Facts suggest otherwise as the Hyperscale cloud providers [Amazon etc.] and other software companies, who are principally focused on enterprise customers, have reported similar levels of declines to their growth rates as us during this year. We believe having an SMB-focused business with geographic industry and business model diversity in a consumption-based model is a key strength of our company.
Indeed, to Spruill’s point, when Amazon and Microsoft and Alphabet all reported results last week, they all indicated that their cloud customers were slowing spending in order to tighten their belts.
Earnings to date.
See the full list at the bottom of the post.
On the call, CFO Bill Sorenson offered some data about what’s going on with customers. He said DigitalOcean had done some surveys of customers who’ve cut their spend. “More than half of those surveyed cited needing fewer resources as the main driver of their decreased spend, suggesting that our customers' demand environment has reduced, which is not surprising given the global backdrop.”
An interesting wrinkle is that DigitalOcean raised prices during the quarter. Spruill told analysts that it’s had a beneficial impact that “exceeded our expectations.”
Spruill said the price increase added over a point of growth to the company’s revenue, which rose by thirty-seven percent last quarter, meaning an additional thirteen million dollars in the quarter. Along with the price change, the company introduced a new lower-priced option for some services. Despite the economic turmoil, noted Spruill, the company’s churn last quarter came in lower than expected, and the number of customers that traded down to the cheaper plan was smaller than expected.
The economic “headwinds,” said Spruill, will ultimately “dissipate,” and as they do, “our new pricing framework is going to endure as we continue to lead in the SMB cloud with a differentiated platform.”
The main pitch for the Street on the call is that DigitalOcean is continuing to focus on two things: revenue growth above thirty percent, and a target to get to a free cash flow margin of twenty percent or better by 2024.
On the first score, Spruill said that “we expect to grow at least 30% next year from the midpoint of the 2022 guidance we're issuing today.” On the second score, he assured analysts, “We're going to get to 20% or better in 2024, which is only fourteen months away.” He repeated a short while later, “We will be hitting 20% or better free cash flow when the calendar flips to 2024.”
Spruill is exploring different avenues to make sure the company maintains those goals. Revenue, in particular, is an area where the company is looking to find additional avenues of growth. “When you get over five hundred million bucks [in annual revenue],” said Spruill, “you’ve got to start looking at other ways to drive revenue growth if you want a three handle on it,” meaning thirty percent or better annual revenue growth, “and that’s what we’re doing.”
One really interesting opportunity is selling more data storage along with the compute services it offers. Spruill contends that Amazon and the other cloud majors are gouging customers on the cost of storage, and that the category is ripe for taking share.
“Our storage revenue currently is high-single digits as a percentage of our total revenue,” said Spruill, “and we believe based upon benchmark and customer surveys that we can double the revenue mix percentage from storage-related capabilities over the next few years.”
Added Spruill, when storage is added to customer plans, revenue per user currently grows, on average, at more than twenty-five times the rate of plans without storage.
And the second interesting avenue of exploration is that DigitalOcean bought a privately held startup during the quarter, Cloudways, a ten-year-old firm headquartered in Malta for which DigitalOcean paid $350 million.
Cloudways operates as a “managed service,” which means that it runs cloud computing with much more hand-holding than DigitalOcean, taking care of the management of the services on behalf of the customer rather than just selling raw capacity.
A managed service can reduce some lost business, said Spruill, by keeping those customers who find they’re not getting enough help. “Often we see that the customers that churn in the first few months in our platform cite that they were looking for a managed experience, something that we have not offered until now,” he noted.
And the Cloudways customers can be more lucrative, he said. “As a good indicator of the value of this customer intimacy, Cloudways generates two-X the pricing for a similar-sized customer footprint.”
As for what happens with the economy, it’s really hard to say, CFO Sorenson indicated when analyst Pinjalim Bora of JPMorgan asked what to expect.
“Going forward, the headwind calculation, Pinjalim, is anyone's guess at this point,” said Sorenson. “I don't think we're necessarily out of the woods.”
He noted that customers spending more money seem to be fairly “resilient.” “What we’re seeing there is our greater-than-$50, and even our greater-than-$250 [a month], customers are still showing resilience.
“But it's hard to basically estimate what we think the impact would be going forward.”
I’ll have more on DigitalOcean when I talk with Spruill later this week. Stay tuned.
Shares of DigitalOcean declined fractionally in late trading. At a recent price of $29.40, the stock is down seventeen percent since I picked it for the TL20 group of stocks to consider in July.
The TL20: DigitalOcean on tap
Nov 07, 2022
The TL20 group of stocks to consider is having a better November so far than the broader market. The group declined two percent last week, better than the nearly six percent sell-off of the Nasdaq Composite Index. With the group now halfway through earnings reports, the results have been pretty good.
With eleven of the twenty having reported, the majority of the reports have been better than expected, and the forecasts, for those that offered them, have likewise come in ahead of expectations. And the average stock “pop” following the report has been a decent four percent.
Arista Networks has been the big star of the season, reporting better-than-expected results this past Monday and offering a stunning forecast for the year ahead when it held its analyst day on Thursday.
I’ll be especially interested Monday to see how DigitalOcean holds up. The company is singularly dedicated to the small and medium business crowd. Last quarter, when I talked with CFO Bill Sorenson, he expressed confidence the company will continue to maintain profitability by controlling costs even in a “tough environment” in terms of the broader economy.
Results in August were more or less in line with consensus, after five quarters of outperformance. The question Monday will be whether we see a breakdown in the company’s ability to hold onto those small business customers.
The TL20 is down thirteen percent since inception on July 15th.
The TL podcast for Nov 6th: Recession-resistant stocks and silicon carbide futures
Nov 06, 2022
Very mixed week for earnings, with Arista Networks the star, Wolfspeed’s controversial plan for silicon carbide, and some thoughts on stocks in The Great Recession.
Arista stuns the Street with a big revenue view for 2023
Nov 04, 2022
While this earnings season has seen a lot of wrecks as a result of worsening macroeconomic trends, the area of computer networking so far appears surprisingly resilient.
In particular, Arista Networks, which sells equipment to hook up computers inside data centers, seems to see no end in sight for its wares.
After beating expectations on Monday evening, and offering a higher-than-expected forecast for the current quarter as well, the company on Thursday stunned the Street with its analyst day meeting. CEO Jayshree Ullal offered a prediction that Arista’s sales will be ten percent higher next year than anyone’s been expecting, $5.5 billion versus the current consensus for just under five billion.
There was quite a bit of gushing by analysts writing home this evening. Wells Fargo’s Aaron Rakers calls the event “what we think should be considered a much stronger-than-expected update” (emphasis Rakers’s). He notes the buy-side has been looking for mid-teens revenue growth on a percentage basis in 2023, but this bumps it up to twenty-five percent growth.
Amit Daryanani with Evercore ISI this evening writes that the event “was more bullish vs. high expectations into the event,” adding that the company “is positioned as not just a 12-month but a multi year story as they disrupt the networking industry.”
There was a lot else that was talked about during the proceedings, much of it having to do with software, both offering more software to customers, and also updating the nature of software in networking for a modern age. Fairly fascinating stuff I hope to reflect on some more in the coming days.
But for the moment, it’s the revenue outlook that is taking everyone’s attention. Arista shares rose by five percent in late trading. The stock is now up twenty-seven percent from when I selected it in mid-July for inclusion in the TL20 group of stocks to consider, making it the best-performing name of the group.
Block, Microchip defy the economy, Twilio succumbs
Nov 04, 2022
It was another eventful earnings day, Thursday, with stark disparities in the winners and losers. Among the winners, two of the TL20 stocks to consider, Block and Universal Display, surged in late trading, as did a favorite of this blog, Cambium Networks, as it moves past supply-chain issues.
Among the losers, Twilio, the communications infrastructure cloud company, plunged twenty-two percent, and even worse was IT software maker Atlassian, down twenty-three percent. Both missed expectations.
And a most curious star of the evening was Microchip, maker of microcontrollers, relatively simple kinds of processors that are used in embedded applications of all kinds. The company can’t keep up with demand, and its results and outlook keep beating expectations, making the Street wonder why this company’s doing so much better than most chip makers.
In each of these cases you see an interesting divide: the winners seem to be relatively immune so far from macroeconomic turmoil, while the losers are being hit by it more and more. I can’t entirely explain the disparity, I’m merely observing it.
First, the winners. Block, formerly known as Square, is Jack Dorsey’s other company, known for the little swipe-reader for credit cards to take payments. The company was rechristened Block as Dorsey last year became enamored of all things blockchain. It also owns TIDAL, the high-def music venture.
Results topped expectations and the stock soared thirteen percent. Block doesn’t forecast, but a lot of the discussion on the conference call was about the company being a little tighter with spending next year.
Asked analyst Tien-Tsin Huang of JPMorgan, “can we get back to operating leverage in 2023?” CFO Amrita Ahuja answered in the affirmative, noting the company has been spending heavily to build new products the past few years but “Our preliminary 2023 plans really significantly moderate those expenses.” That includes “moderating” new hiring, and cutting its spend on advertising that didn’t produce the biggest return-on-investment.
Interesting to me was that the call didn’t involve much discussion of the economy, which is surprising given that Block serves numerous small businesses. Ahuja noted that so far, in the U.S., the company is seeing “stability” in its customer base across different industries.
Cambium is a fascinating wireless networking provider. Its showcase technologies at the moment are wireless access points and switches that enterprises and small service providers can use with the rather new WiFi 6 standard to provide hundreds of megabits per second of wireless networking either inside an office or over several kilometers of a campus or city environment.
Shares surged seven percent in late trading as revenue came in nine percent above expectations, very healthy upside. The company had a really rough second half of 2021 and beginning of 2022, as it struggled to get enough parts to assemble its equipment. That is behind the company now, as the supply situation has gotten better and better.
In a Zoom meeting following the report, CEO Atul Bhatnagar told me, “We are well positioned for solid growth in 2023,” meaning revenue growth. The Street is modeling twenty percent revenue growth in 2023. Bhatnagar reiterated a point he made when I spoke with him in August, which is that 2023 is just the start of several years of growth for Cambium, the “knee in the curve” of an “S-curve” of growth, he claims.
Another winner was Universal Display, which makes the basic ingredients of the organic light-emitting diode, or OLED, display technology that is included in smartphone and TV screens. The Street breathed a sigh of relief because the company didn’t have to cut its revenue outlook after having done so the prior quarter. The OLED market has been tough this year with the collapse in the smartphone market, which has led to only six percent revenue growth for Universal this year, down from twenty-nine percent last year.
On the call, Universal’s CEO, Steven Abramson, was upbeat about what he said will be a huge expansion of OLED use in 2024, as the panel makers who make the OLED screens using Universal’s technology move to larger-size panels.
He cited some data from research firm UBI saying that the uses of OLED for laptops and tablets is going to soar by four hundred percent in the next four years, with almost fifty million units a year produced using OLED in 2027. Oh, and Abramson also said some other market research shows the total OLED market will double in value by 2030 to a hundred billion dollars annually.
That’s rather remarkable given that the OLED market is now twenty years old, by my reckoning. Universal stock rose by nine percent in late trading.
As I said, Microchip, the maker of those embedded microcontrollers, is keeping analysts scratching their heads. The company hasn’t missed expectations in three years. And it hasn’t missed with its revenue forecast in two years. Demand keeps rising for the company’s chips, and CEO Ganesh Moorthy told analysts this evening that the company has a growing backlog of chips ordered that it has not been able to supply given supply-chain issues.
And we exited the September quarter with our highest unsupported backlog ever, with unsupported backlog well above the actual net sales we achieved. We are working hard to reduce our unsupported backlog to more manageable levels and expect to do so in the coming quarters, but also expect to remain supply constrained through the rest of 2022 and well into 2023.
Now, that’s a good problem to have, as they say, but analysts are finding it hard to believe business is so good. One analyst, Ambrish Srivastava of BMO Capital, said, “I just can’t help asking this question because weakness is rampant, it’s everywhere” among chip companies, “how are you managing the soft landing” that the company seems to be promising.
Moorthy replied that the company is in markets whose products don’t go up and down in terms of demand. “Most of these customers in these end markets are not in volatile markets,” he said. “They’re looking at the long term.” Microchip sells microcontrollers, as I said, things that can go into industrial systems, medical devices, street lamps, etc. It’s so diverse, you could surmise that it’s the kind of stuff that just keeps getting built in good times and bad.
Oh, and another thing: Within Microchip’s backlog, which is at an all-time high right now, half of those orders are “non-cancelable,” said Moorthy. That helps.
It’s interesting that Microchip’s business has been so resilient this year given that its stock has not escaped unscathed: shares are down thirty-two percent this year.
The losers this evening included Atlassian, which is best known in IT circles for JIRA, a software program that manages the trouble tickets when an employee contacts the help desk to say they need a replacement laptop, and those kinds of things.
In the company’s rather engaging shareholder letter, co-CEOs Scott Farquhar and Mike Cannon-Brookes offered, “in the spirit of our ‘open company, no bullshit’ values, let’s start with the topic that’s top of mind for shareholders: macroeconomic impacts.”
Well, the economy is catching up with Atlassian. “Last quarter, we shared that we saw a decrease in the rate of Free instances converting to paid plans. That trend became more pronounced in Q1,” they write. “This quarter, we started to see a slowing in the rate of paid user growth from existing customers.” Not good either.
This story is now familiar among most software companies. Companies of all sizes are seeing these effects of slowing sales, deals taking longer to close. In Atlassian’s case, though, the Street is showing very little patience for the matter. I think that’s probably because Atlassian had beaten expectations with its forecast seven quarters in a row before this one. It’s a shocker to see the sudden downside.
Last but not least, Twilio is a somewhat similar story of a surprising breakdown in performance. The company has had a flawless record of beating revenue expectations with its reported results every quarter, including this one. However, its ability to forecast revenue has suddenly broken down this year. Thursday was the third quarterly revenue forecast in a row that missed expectations.
The company held its annual analyst day meeting today, so there were hours and hours of presentations by CEO Jeff Lawson about the technology vision and the product, but there were also questions about why the company has been seeing a breakdown in demand for its product.
Lawson replied that the company has been dealing with a “slowdown” in some industries as a result of macroeconomic pressure, such as crypto-currency trading and social media. He said that has recently been spreading to other industries.
Lawson did have one consolation prize: He made the case that because Twilio software is purchased on a “consumption” model, where it can be dialed up or down by the customer as needed, his company’s revenue can be more volatile in either direction.
“As the economy declines, we feel those slowdowns a lot faster,” he said. “And I think, equally, when the economy improves, we would expect to see a faster overall recovery than what some of the subscription folks would field.”
Well, here’s hoping!
Eye of the Storm: Amplitude, Confluent rising, Roku, Qualcomm tumble
Nov 03, 2022
It’s getting rough out there this earnings season, with even fair performance being punished.
ZoomInfo, a software maker that acts as a kind of rolodex in the cloud to aid sales and marketing for prospecting, saw its shares sell off by twenty-nine percent Wednesday, even though the company beat expectations with its quarterly report Tuesday evening, and with its outlook.
Problem was, ZoomInfo has, on average, offered a revenue forecast that’s five percent higher than expected over the preceding five quarterly reports. This time around, it offered a forecast that was just a fraction of one percent higher, $300 million versus the consensus $298 million. Not good enough.
People are slicing things very thin at this point. The upside-down result of that is that some companies that miss expectations are seeing their shares respond favorably, if the miss wasn’t too bad, while others are selling off whole-hog if their upside isn’t good enough, like ZoomInfo. Not entirely surprising in a market that is extremely skittish, the Nasdaq Composite Index dropping over three percent on Wednesday.
Wednesday evening, it was cybersecurity vendor Fortinet’s turn to be punished despite solid results. The company beat expectations for profit and revenue and also forecast this quarter higher.
But “billings,” which is one of The Metrics, those non-GAAP measures the Street uses as an extra hint about how things are going — money billed to customers in the period, including amounts not yet recognized as revenue — was only a fraction of one percent higher than expected. That’s the smallest upside in billings in years. Hence, the stock sold off eleven percent in after-hours.
This is what I mean: the Street is terribly anxious and is punishing the slightest slip-up.
The relative bright spots Wednesday were Amplitude, the maker of analytics software to tell programmers if their programs are being used successfully; and Confluent, the maker of middleware known in open-source circles as Kafka, for real-time, streaming enterprise data.
Amplitude shares rose four percent after-hours, while Confluent jumped by nine percent. Mind you, both stocks had sold off sharply during the regular session amidst the general carnage Wednesday.
Amplitude makes tools for software developers that help them understand how their apps are being used, and how the apps could be better if a developer made adjustments.
I had a chance to talk with Spenser Skates, co-founder and CEO of Amplitude, right after the release came out. He was upbeat about his company’s performance, but also indicated that things are going to be tough all around going forward in software land.
“Macro is hitting everyone,” Skates says. “First, it was companies’ sales cycles elongating,” he says, meaning, customers were taking longer to sign a purchase. “Now, it’s customers re-prioritizing their spend.”
“For sure, the next few quarters are going to be tough for every single company out there.”
Skates was encouraged that Amplitude’s own software programs are still the kinds of programs companies will buy even in tough times.
“It was a record quarter for us in terms of new business signing up for Amplitude, the most new businesses ever, and we did an eight-figure deal,” says Skates, the first time the company has gotten eight figures for a deal. He wouldn’t disclose the company’s name but noted it is a major technology company but not the kind that is immediately associated with aggressive technology use in its own operations.
The pace of signing up new business tells Skates that “In spite of the macro [economic conditions], we are still a must-buy for a lot of teams.”
“Product and data investments tend still to be at the top of the list for companies, unlike a lot of marketing-tech or sales software.”
However, Skates notes that his customers are asking him to work with them on time frames: they may want to trim some spend in a given period, he said, putting off a portion of their purchase to a later date.
“One thing we have seen from our customers is that there is a segment of customers that had expected some growth this year that’s not materializing for them, and so they’d like to right-size their contract with us — we’re trying to help them solve that in a number of ways.”
He said Amplitude is trying to be extra transparent in showing the customer what they are paying for. “The good thing is, we haven’t seen anyone switch out to a competitor,” he said. “Every customer I’ve talked to, they plan to triple their revenue growth the next seven or eight years, and they see growing with us, but this quarter, they need some help.”
In other developments, Skates boasted of Amplitude’s having added thirty new product features in the quarter to its software, “to keep up the pace of innovation, which is something that’s really hard for companies as they scale.”
A measure of product health, he noted, was that the company’s total “ARR,” or annualized recurring revenue, for its two newer products this quarter totaled ten million dollars. That is a small amount when set against annual revenue of perhaps two hundred and thirty million. Still, Skates tells me it is a “milestone” because it shows the company can add additional programs and is “not a one-trick pony.”
“For sure, the next few quarters are going to be tough for every single company out there,” says Amplitude’s co-founder and CEO Spenser Skates of the landscape for selling software. Despite his customers’ belt-tightening, he is reassured by the fact that “we are still a must-buy for a lot of teams.”
Confluent was also talking about The Macro this evening. During its conference call, the company got ahead of the 2023 speculation by offering a preliminary forecast for revenue next year in a range of $760 million to $770 million, which is ahead of the current consensus for $763 million. The company’s CFO, Steffan Tomlinson, told analysts on the call “we're assuming that the overall macro dynamic that we see today will continue to persist throughout next year.”
An interesting case this evening was Hubspot, a vendor of sales and marketing software, and a member of the TL20 list of stocks to consider. Hubspot had a kind-of inverse of the Fortinet Effect: it beat expectations but its forecast for revenue came in below consensus, and the stock still surged by twelve percent in late trading.
The reason the stock didn’t sell off is that Hubspot is being given a pass because its revenue forecast is hampered by the rising U.S. dollar. Investors have gotten used to the dollar’s deleterious effect and so they look past it. The prior two forecasts Hubspot offered were also lower than expected for that reason. Given that management Wednesday evening told analysts the same sad story of software sales taking longer to close, Hubspot gets a pass for dealing with adversity with, you could say, fortitude.
Companies that did not do as well Wednesday were Qualcomm, another TL20 name, and Roku, the purveyor of media programming software and advertising services for interactive TV.
While Qualcomm beat expectations, its forecast for revenue was lower than expected, the second quarter in a row of disappointing forecasting. On the call this evening, CEO Cristiano Amon talked about what he characterized as “short-term challenges,” remarking that “as we look to fiscal 2023, further deterioration of the macroeconomic environment and extended China COVID restrictions have resulted in demand weakness and temporary elevated channel inventory across the industry.”
In particular, the company’s smartphone customers are selling even fewer units of phones than already diminished expectations, and they’re scrambling to reduce their inventory of chips as a result. This is what we’ve heard from companies such as Advanced Micro Devices and others in the semiconductor industry, the great inventory challenge.
Qualcomm is instituting a hiring freeze, CFO Akash Palkhiwala told analysts. Qualcomm stock dropped almost eight percent after-hours.
Roku, of course, has been dealing with the breakdown of the advertising market this year, as well as supply-chain problems that hit sales of television sets, which hampers Roku’s ability to sign up subscribers.
The company beat the revenue number tonight, but the revenue outlook missed and the stock dropped eighteen percent. Founder and CEO Anthony Wood told the Street that macroeconomic pressure on advertising and consumer spending is going to get worse this quarter, and Roku is doing even more now to curtail expense growth.
And Roku’s CFO of eight years, Steve Louden, plans to step down next year after finding his replacement. That’s too bad, I have enjoyed on many occasions interviewing Louden. Wood pointed out that Louden had been planning to move on three years ago but delayed that during the pandemic.
Wolfspeed makes its case for a ton of financing
Nov 02, 2022
It was standing room only Monday morning in a ballroom upstairs at the New York Stock Exchange. I crowded in with about a hundred people to hear management of Wolfspeed make the case for raising a lot of money to advance the semiconductor technology known as silicon carbide.
Silicon carbide, which I covered in a longish piece in February, is a semiconductor that is key to electric vehicles. Tesla started the use of SiC, as it’s called, and all the other carmakers are following suit. Wolfspeed is one of the few chip makers on the planet that can make the stuff, which is more complex than plain-old silicon, more of an art. And Wolfspeed is definitely in the pole position in SiC at this point.
The highlight of the morning was an appearance on video hook-up by Thierry Bolloré, CEO of Jaguar Land Rover. Jaguar has struck a partnership with Wolfspeed to secure supply of SiC for years into the future in order to go all-electric with its vehicles.
The morning was fascinating, both because it offered lots of great detail about how Wolfspeed’s business will progress, but also because there was some controversy.
Investors and analysts are a tad unnerved at the moment because the reality is coming home to them that it is going to take a lot of capital to make all the SiC that Wolfspeed can sell.
That’s not a bad thing: real technology that advances whole industries costs real money. But Wolfspeed stock has spent most of the last few years simply being rewarded because of high demand for SiC. Now, suddenly, it’s as if the waiter has come with the bill.
Specifically, Wolfspeed’s CEO, Gregg Lowe, and the company’s CFO, Neill Reynolds, told the audience that they need to be able to cover six and a half billion dollars worth of capital expenses over the next several years to expand their factories.
Silicon carbide “may be the largest single growth of any technology in the history of semiconductors,” says Gregg Lowe, Wolfspeed’s CEO. It’s also going to take a balancing act of financing and factory building that investors are having to get their heads around.
The two had previously mentioned the need for capital, most recently last Thursday when Wolfspeed reported earnings. But Monday was the first time they disclosed a dollar amount.
As Reynolds explained it, the company has expectations for how many products its SiC will be designed into, a total “pipeline” worth eighteen billion dollars through 2027. That gives the company a pretty good line of sight to four billion dollars in annual revenue in 2027, up from one billion this fiscal year. And to go after all that requires almost seven billion in capital investment between now and then.
As Reynolds explained:
Now, right now, what we see is, as you look at that demand and that design-in curve, a lot of opportunity for a while. So, we'll need that supply. So, the plan is to continue to tool those factories out as fast as possible. But we’re going to have a fixed-cost investment in facilities, and then we'll also see tooling out those facilities over time as we drive up to $4 billion of revenue by 2027.
So, what do you get for that? So, overall, I've said it many times, it's 2:1 CapEx to revenue ratio, and what does that mean? That equates to roughly a $6.5 billion investment over the next five years, four to five years, but what do you get for it? You're getting the largest and the top state-of-the-art automated 200-millimeter silicon carbide footprint in the world. And that footprint is here to serve the industry's top customers.
What that means for investors is that the company’s ability to get to positive free cash flow won’t come for another three to four years. Wolfspeed had negative free cash flow of six hundred million dollars in the past twelve months, and has been bleeding cash like this for three years, ever since the push to be a SiC powerhouse.
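Reynolds’s framing can be checked with a quick back-of-the-envelope. The snippet below uses only the figures cited in the presentation ($6.5 billion of planned CapEx, revenue going from roughly one billion this fiscal year to four billion by 2027); the variable names are mine and the calculation is illustrative, not the company’s own model.

```python
# Back-of-the-envelope on Wolfspeed's stated capital plan,
# using only figures cited at the analyst meeting. Illustrative only.

capex_total = 6.5e9       # planned capital spend over the next ~5 years
revenue_2027 = 4.0e9      # targeted annual revenue in 2027
revenue_now = 1.0e9       # roughly this fiscal year's revenue

# Measure the spend against the incremental revenue it is meant to unlock
incremental_revenue = revenue_2027 - revenue_now
capex_per_incremental_dollar = capex_total / incremental_revenue

print(f"CapEx per incremental revenue dollar: {capex_per_incremental_dollar:.2f}")
# roughly 2.17 -- consistent with Reynolds's "2:1 CapEx to revenue" framing
```

On those numbers, every incremental dollar of annual revenue costs a bit over two dollars of capital spending up front, which is why the free-cash-flow inflection doesn’t arrive until 2026 or so.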
Investors are now waking up to the fact they’ll have to live with that cash drain a few more years.
Reynolds and Lowe emphasized that come 2026, all that investment will start to turn into a gusher of positive cash:
Over time, as we build out these facilities with this great operating cash flow capability, you'll start to see the operating cash flow pick up over time, getting to over $1.3 billion by 2027. And if you look at the right-hand side of the chart, as we make investments, we'll see a decrease in the free cash flow and sort of making that transition to free cash flow positive out in 2026 and 2027 as we really start to see the benefit of these large automated facilities taking hold.
Here’s the gist of it all in a single slide:
It was clear the disclosure of big funding needs really weighed on the audience. Lowe and Reynolds, after spending hours discussing the business Monday morning with the help of team members who gave excellent technical presentations, spent a good deal of the Q&A portion fielding questions about the money.
You could feel there was a certain tense quality to it. I think some investors were frankly shocked that Wolfspeed now looks to be a more capital-intensive kind of business for longer than investors have been expecting.
Part of the shock is that Lowe and Reynolds have not yet decided which means of financing they will choose, though they talked about their options. Those options include funding from the government, which is a very real prospect given that there is a push to “on-shore” production, the CHIPS Act, etc. Wolfspeed got several hundred million dollars from New York State to build their new factory in the Mohawk Valley in upstate New York, for example.
Other financing options include having customers fund some of the expense years before product is ready; having private sources lend money or take a share of specific projects; and last, tapping public markets for debt or equity.
The last of these, levering up or doing dilutive equity raises, is the least desirable, and Lowe and Reynolds were careful to emphasize multiple times that they will only seek dilutive deals as a last resort. Still, I think a lot of investors can’t help feeling shaken at the mere mention of dilutive financing.
Analyst Harsh Kumar of Piper Sandler told Lowe and Reynolds that the question of how the company will get financing was “the only question that I'm getting in the last four days on your company.” Does Wolfspeed even have access to that much financing? he demanded to know.
Reynolds told Kumar, “There is access to capital in these areas,” and offered a bunch of thoughts about different ways the company can go. It still was rather open-ended, not definitive.
Bottom line, Lowe and Reynolds don’t yet have specifics on how they’ll do that financing. And that uncertainty is not welcome at this point. Wolfspeed shares declined by seven percent Monday, worse than the broad market even on a rough day for stocks.
The uncertainty Monday, moreover, follows uncertainty last week that compounded things. Wolfspeed’s Thursday earnings report included a forecast for this quarter that missed expectations, the first miss in a year. The stock plunged eighteen percent the next day. The proximate cause of the shortfall is a delay in availability of some spare parts for Wolfspeed’s factory in Durham, North Carolina. It’s a small thing, in the scheme of things, but investors are in no mood for any slip-ups.
It’s part of a risk-off attitude toward SiC at the moment, after the technology has been a darling for the past year. Notice that Wolfspeed’s top competitor in SiC, On Semiconductor, reported healthy quarterly results Monday, but saw its shares sell off by nine percent. As recently as August, these two companies were a bright spot in a rough semi market; now it seems they can do no right.
There is another point of view, however, a more positive view, and it’s worth considering.
I had an interesting chat following the presentation with Jed Dorsheimer of William Blair, who was also at the meeting. Dorsheimer, who was my main source for my article on SiC in February, has the most comprehensive view on the bullish prospects for SiC and for Wolfspeed and On Semiconductor.
Dorsheimer is a contrarian: rather than worrying about the six-and-a-half billion dollar bill, Dorsheimer asked Lowe why the company is proceeding slowly and not more aggressively with financing.
As Dorsheimer remarked to me, “I am just not sure the capital plans capture the value of being at 200-millimeter versus 150-millimeter, as well as the vertical integration.” Dorsheimer was referring to Wolfspeed’s transition to making chips from eight-inch diameter, or two hundred millimeter, wafers of SiC, a bigger wafer than the industry standard of six inches, or one hundred fifty millimeters.
The reason that is important is because the transition to a bigger wafer is going to bring tremendous economies of scale, says Dorsheimer. As I explained in February, that will lead to great benefits for Wolfspeed’s customers. And Wolfspeed will be the only company in the industry with that larger wafer capability. That transition is “a game changer,” Dorsheimer maintains.
To Dorsheimer — and he made this point in the Q&A session Monday — the move to larger wafers is a bit like chip manufacturing giant Taiwan Semiconductor making chips below ten nanometers in dimension: it’s a breakthrough of epic proportions.
Neill Reynolds, Wolfspeed’s CFO, was mobbed after the presentation. For many investors, the big new financing plan is the only thing they can think about.
When Dorsheimer made that point in the meeting, and asked Lowe why the company is pursuing a “crawl, walk, run” approach to financing, Lowe demurred. He obviously understood Dorsheimer’s positive take, but he is not inclined to over-play his hand just yet.
“We're a relatively small company compared to some of the big juggernauts,” observed Lowe. It’s easy for Intel, he noted, to start up multiple factories. Wolfspeed takes a little longer to build things out.
When I asked Dorsheimer what it will take for investors to get comfortable with the big financing ramp, he told me it’s just going to take time for them to see the proof in the pudding, if you will.
“I think it’s all relative,” Dorsheimer says. “Some of these same investors are fine with a twenty-billion-dollar budget for three-nanometer” factories being built by TSM.
“That is because Taiwan Semi has demonstrated operational excellence,” Dorsheimer notes. “Wolfspeed has to demonstrate and build confidence here.”
What will happen, Dorsheimer predicts, is that confidence in Wolfspeed will build as investors see more and more of those investments pay off. What matters, he says, is “the fact that your capital efficiency is as good as it is,” meaning, the kind of cash flow that Reynolds is predicting starting in 2026. That “cash-on-cash return” matches Dorsheimer’s own back-of-the-envelope that he had come up with prior to Monday. It’s a very good return on investment, in his view.
Time is on Wolfspeed’s side, says Dorsheimer. The production of SiC is not just important for Tesla and Jaguar and others, it is a breakthrough material on which much of the planet’s green ambition will depend. “The data suggests there should not be a dollar of capital put in the ground for silicon power electronics vs. silicon carbide,” says Dorsheimer, meaning, plain-old vanilla silicon chips, the kind that have dominated chip-making until now.
Silicon carbide is the future, and for the foreseeable future, Wolfspeed is its greatest apostle.
So what does the stock look like? Wolfspeed is one of the most expensive semiconductor stocks around. Its shares trade for about eight times the next twelve months’ expected revenue. Mind you, that’s less than it traded for six months ago, and less than half the multiple it had a year ago. The question is whether it is a multiple the stock can support now that investors have to reassess what profit looks like for the next several years given the investment curve.
Wolfspeed is still way cheaper than the most expensive chip name, Nvidia, which fetches almost twelve times future revenue.
It may be that after investors have had a chance to digest this week’s unsettling news about financing, they will return to focusing on the positives. As Dorsheimer suggests, the financing is going toward a technology that has already proven to be revolutionary. Wolfspeed is among a handful of companies that can master this technology. Demand is unquestionably very large. And supply is tight all around.
“It may be the largest single growth of any technology in the history of semiconductors,” Lowe said on Monday. “I think the supply is going to be chasing demand probably through the end of this decade,” he added, meaning not just Wolfspeed’s supply, but also what On and anyone else can produce.
And everything about SiC is hard to do, somewhere at the intersection of science and industry. “It takes time,” Wolfspeed’s head of R&D, Elif Balkas, told the audience regarding SiC production.
“The material itself is very difficult to grow, difficult to process,” said Balkas. “One of our leading scientists said that you have to rush slowly.”
“Silicon carbide, it's a tough thing,” added Lowe. “These things are not for the faint of heart.”
Nor, perhaps, is Wolfspeed stock at this moment in time for the investor who is faint of heart.
Arista delivers stellar results: Customers are ‘desperate’
Nov 01, 2022
Arista Networks, which makes the bulk of its money selling networking equipment to the large data centers of giants Microsoft, Meta, and others, turned in a stellar performance Monday evening, which is interesting considering that both Microsoft and Alphabet had warned last week of slowing use of cloud services by their customers.
The revenue number, $1.18 billion, was eleven percent higher than the consensus $1.06 billion, which is the highest upside in years. Profit also beat handily, by nineteen percent.
The outlook for this quarter looks strong as well, a projected level of revenue about six percent higher than consensus.
These are, mind you, within the context of very strong growth. Sales last quarter were fifty-seven percent higher than a year earlier. The forecast revenue this quarter would be more than forty percent higher.
CFO Ita Brennan noted that a year ago, during the company’s November 2nd analyst day meeting, she had outlined an expectation for thirty percent revenue growth for 2022. With tonight’s higher-than-expected forecast, the total should come in well above that, more than forty-five percent growth, about $4.29 billion.
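The implied growth math checks out with a quick back-of-the-envelope, assuming 2021 revenue of roughly $2.95 billion (a figure not stated above):

```python
# Back-of-envelope check on Arista's implied 2022 growth.
# Assumed figure: 2021 revenue of roughly $2.95 billion.
revenue_2021 = 2.95      # billions of dollars (assumption)
revenue_2022_est = 4.29  # the ~$4.29 billion total implied by the forecast

growth = (revenue_2022_est / revenue_2021 - 1) * 100
print(f"Implied 2022 growth: {growth:.1f}%")  # roughly 45%, well above the 30% outlined a year ago
```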
CEO and co-founder Jayshree Ullal told analysts during the evening’s conference call that the results had a disproportionate amount of what she refers to as the “Cloud Titans,” the very largest cloud companies including Meta and Microsoft.
Arista’s chief operating officer, Anshul Sadana, called out in particular how Meta and Microsoft had been deploying switches from Arista to make use of the highest available optical transmission speeds, four hundred gigabits per second, a wave that has been building for several years and now seems finally to be crashing upon the shore, as it were.
Ullal said this is one of the strongest years for the cloud companies as customers since Arista’s 2014 IPO. They will make up more than forty-five percent of this year’s sales, she said.
Ullal is a straight shooter, I find. When Needham & Co. analyst Alex Henderson asked her if the company can possibly maintain such high rates of growth, she said things probably have to cool at some point.
“It's going to be difficult to sustain 45% growth every year,” she said. “I'd love to have it, but as you know, Arista is a volatile business and you have to think of us across a three to five-year CAGR [compounded annual growth rate], not just on an annual basis.”
Another analyst, George Notter, with Jefferies & Co., wanted to know how the company could be sure that Meta and Microsoft are not stockpiling inventory, which could conceivably hurt Arista’s sales at some point.
Said COO Sadana, “I would say the best sign is the number of phone calls I get or Jayshree gets or others in the company get, when are you shipping? So, these customers are still roughly hand to mouth… customers are desperate.”
Wow, I guess desperate customers is the way you’d like them.
Ullal was asked the macroeconomic question, and confined her remarks to saying that “data center spend has been very strong” and that she expects that to continue. Ullal did note that Europe is one place where there was some weakness sensed among the customer base.
As in prior quarters, Arista continues to use its balance sheet to secure huge purchases of parts well into the future. The supply constraints continue, said Ullal and Brennan; Arista is not entirely able to meet demand. So, each quarter, the company keeps purchasing years into the future.
Interestingly, Ullal said the company “fully expect[s] to grow double digits next year”; even if the supply chain remains constrained, she said, revenue growth will still probably be double digits.
One analyst, Simon Leopold, with Raymond James, said it seems such giant purchases should be pointing to either much bigger sales next year than expected, or a lot of inventory that Arista has to warehouse.
CFO Brennan pushed back on that. “We are thinking about this, kind-of, longer term than just the year that's ahead,” said Brennan. That might mean that some of the parts on order won’t even show up until some time down the road.
It’s a tad obscure, and Brennan advised Leopold to wait till later this week, as Arista will again hold its analyst day meeting on Thursday, and she intimated there will be some further discussion about the whole ordering forward thing.
The only blemish in all of this is that the company’s gross profit margin is lower than some would like. At 61.2% last quarter, it is a few points lower than it had been the last few years.
Ullal, when pressed on the matter, pointed out that Arista has more business now from the cloud giants than it has ever had in its history. Those cloud customers tend to make lower-margin purchases, so they depress Arista’s overall corporate gross profit margin.
And that is, at the end of the day, what I think many investors focused on, that gigantic influence of cloud companies, and two in particular, Meta and Microsoft. Arista’s stock declined fractionally in late trading, which tells me that people were not as thrilled with these stellar results as it seems they should be.
My guess is the balance of opinion thinks that Meta and Microsoft have to stop buying so much gear at some point, and that it will torpedo Arista’s results. That’s despite the fact that Ullal remarked on the call that Arista has better “visibility” than ever, meaning, the company’s ability to calculate when it will get orders from its customers.
So, good report, nervous environment. I will point out that when I interviewed Ullal back in May, she told me the company’s now enjoying a period of growth akin to a prior growth spurt, in 2018. However, sales this year of better than forty-five percent would be quite a bit higher than the thirty-one percent rate of growth in 2018. So far, she’s doing exactly what she told me she would.
What may be eluding investors, and giving them anxiety about having so much depend on Microsoft and Meta, is this: investors see those customers’ demand as transitory, but Ullal contends it is strategic, meaning, it extends over many years because those giant companies have to invest; they can’t stop.
When Mark Zuckerberg last week talked about a staggering $101 billion in operating expenses he expects in 2023 for Meta, he also noted that Meta is plowing capital spending into its data center infrastructure. Meta is building the Metaverse, after all.
So, either investors are right, and Microsoft and Meta are a house of cards that will come apart next year for Arista, or Ullal is right that these companies are desperate for equipment for years into the future … or the truth is somewhere in between.
Arista stock is down sixteen percent this year, and the stock is up nineteen percent since I picked it for the TL20 in July, making it the second-best performing stock of the group following Pure Storage.
The TL podcast for October 30th: A good week for stocks, bad week for earnings
Oct 30, 2022
Apple was the saving grace, why can’t Alphabet or Microsoft say anything about the economy, and looking ahead to a big day for silicon carbide on Monday.
DriveNets: A challenge to Cisco that’s intriguing
Oct 29, 2022
“Ten years from now, this industry will not look like it does today,” says DriveNets’s Ido Susan, with the supreme conviction a Cisco challenger must maintain. “The train has already left the station.”
The field of computer networking is one of the last bastions of unperturbed intellectual property.
Personal computers and servers long ago became a commodity battle between Dell and HP and Lenovo. And data storage, collections of disk drives, also became something of a commodity, a resource you can rent in the cloud for pennies.
Computer networking, however, has never been commoditized. Networking equipment, most of it sold by the giant, Cisco Systems, is an integrated combination of hardware and software that defies commoditization.
While most of computing is like a McMansion, Cisco gear is like a European cathedral: traditional, mysterious and unassailable. No challenger has ever displaced Cisco’s richest franchise, the network “router,” the most complex and therefore the most valuable part of networking.
That may be changing. For the first time in a long time, Cisco’s routing fiefdom appears vulnerable.
“We have a big, big mission: to go and transform networking,” says Ido Susan, the CEO and co-founder of a startup based near Tel Aviv named DriveNets.
DriveNets has received nearly six hundred million dollars in venture capital to assail the Cisco routing fiefdom. I talked recently with Susan via Zoom, he in Ra’anana, twenty minutes from Tel Aviv, and I in New York.
Although this blog is about public equity, it can be worth listening to promising startups. They may be the next great IPO, and they can tell you interesting things about how technology is changing.
Challenges to Cisco are always a fascinating development. Cisco’s greatest challenger, Arista Networks, emulated Cisco’s business plan, building an integrated hardware and software device, but in another product category, network switches, a simpler part of networking.
Many smaller challengers have come and gone, such as Nicira, bought by VMware; BigSwitch, bought by Arista; and Cumulus Networks, bought by Mellanox (which was in turn bought by Nvidia). They all had promise, and amounted to very little.
The only companies that have made a dent in the Cisco routing empire are Juniper Networks and Nokia, and they mostly play second and third fiddle. Neither has changed the fundamental nature of routing, that close integration of hardware and software.
Funny enough, it’s often the case that challengers to Cisco come from Cisco itself, as is the case for Susan.
“I always joke that my education came from Cisco,” says Susan. Though he grew up on a kibbutz in Israel, he didn’t finish his studies there. Instead, he became an entrepreneur, and at only twenty-six years old, sold his first startup, called IntuCell, to Cisco for half a billion dollars in 2013. He stayed on for two and a half years.
"It’s the best school that you can ask for,” says Susan of Cisco, especially to learn first hand from the sales “machine,” he says. “Just amazing people.”
Like all challengers, Susan’s admiration for his alma mater is mixed with a sense of the inevitable, the belief that, ultimately, progress will crack the Cisco edifice.
“Ten years from now, this industry will not look like it does today,” says Susan. “The train has already left the station.”
What DriveNets sells is a piece of software, a network operating system, that can run on computer hardware from any number of vendors the same way Microsoft Windows can run on Dell or HP or Lenovo PCs and servers.
A router can be thought of as a computer that’s dedicated to running only one program. It continually runs an algorithm that calculates, at any moment in time, the best of several available network links to send a piece of data from one computer to another. You could say it’s a traffic control system for data.
The kind of routers DriveNets is seeking to displace are the biggest, most expensive that Cisco sells, known as “core” routers. As the most powerful routers around, they’re not designed for connecting PCs. They’re built to direct the enormous rivers of billions of packets of data that flow across the backbone of the Internet.
Core routers are bought by telephone companies, cable operators and other service providers whose business is running the fiber-optic lines that make up that backbone. As such, the core router is one of the most sensitive pieces of equipment in all of networking. It is the brains of it all. If a router breaks, whole swaths of the Internet can go dark and Web sites can disappear.
That is, in fact, what happened to Facebook a year ago. A routine update to Facebook’s core routers’ software caused the routers to stop “advertising” the presence of Facebook on the Internet. For seven hours, nobody could find Facebook. In a bizarre example of how everything is connected, Facebook’s own employees were shut out of their physical offices as their I.D. badges stopped functioning.
Given the stakes, it is no small feat that DriveNets already has won AT&T as a customer for its router software. It is working with, or in talks with, almost one hundred other service providers that Susan declined to name. It helps that Susan knows the market. The company he sold to Cisco, IntuCell, made software to help telcos manage cellular networks.
The inspiration for DriveNets is the conviction that Cisco’s routers must be “opened up” the way every other part of computing has opened up.
“We saw all the transformation that happened on the compute side,” says Susan. “The mainframe of IBM moved to [Intel] x86 [processors], and then it moved to VMware virtualization, and put everything in the cloud.” Likewise, data storage had at one time been exclusively “the hardware of EMC” but got opened up by “software-defined storage.”
Not so for networking. “Networking is still the mainframe of the world,” says Susan.
To open up networking, DriveNets has built what you could think of as a new kind of VMware for a cloud computing era.
“We designed it from the beginning as cloud-native, we built everything from scratch,” he says.
The DriveNets routing software lives inside what’s called a “container,” a small bundle of software code that has all the things it needs and is self-sufficient like a single-celled organism. That bundle exists with thousands of other bundles in each networking hardware box, like highly productive bacteria in a host body.
All those “virtual” routers share resources to make the most of every piece of hardware. In this way, networking becomes as efficient and flexible as cloud computing and storage.
A number of benefits can ensue. Networking becomes an “as-a-service model,” whereby the consumer of networking can buy it “self-service and on demand,” explains Susan.
Instead of carefully planning and procuring equipment, a carrier’s network becomes a simple matter of turning on more software containers the way Amazon turns on virtual x86 servers.
Disasters like the Facebook crash need never happen. Instead of upgrading a router while it’s running, a service provider can just turn on a replacement router container with the right settings and switch over.
As good as all that sounds, selling the vision has been an uphill battle.
“It’s a very conservative industry,” observes Susan of AT&T and other operators. When DriveNets first started, he says, the incumbents, meaning Cisco and Nokia and Juniper, exploited that conservatism. Like the days of IBM’s mainframe, “nobody ever got fired for buying Cisco,” and customers were warned to think twice about switching.
“They [Cisco] would say, networking is not like compute and storage, you can’t convert it to a software base,” Susan recalls. “They would say, you can’t separate routing from the underlying physical network hardware,” the fiber-optic transceivers that Cisco sells as part of its machines.
“Yes, and no,” says Susan. It is true, he says, that when DriveNets started, six years ago, x86 server hardware was too primitive. It had to be over-engineered to have the necessary horsepower for routing.
“We had good product, but each server cost $20,000 because that hardware was designed for computers, not for networking,” he explains.
Things began to change as efforts coalesced in the industry to find broad alternatives to Cisco’s fiefdom. Those efforts include The Open Compute Project, a consortium backed by Facebook to advance a general hardware standard.
OCP fostered a new crop of hardware from “original design manufacturers,” or ODMs. Those are the little-known companies, mostly in Asia, that for decades have assembled PCs on behalf of Dell and HP and Lenovo. Companies such as Edgecore Networks, a subsidiary of Accton Technology, and privately held UfiSpace, both Taiwan-based firms, learned to build an economical kind of network box. They benefitted from a ready supply of networking chips from Broadcom, the dominant vendor of networking silicon.
As commodity network hardware emerged, the DriveNets software began to make more sense. The initial ODM hardware was designed only for the simpler network switches. DriveNets found its entry point.
“We came to them [the ODMs]” with the DriveNets software, he says, “and we told them, Guys, let's complete the portfolio, let's make it available for routing as well.”
Other parts of networking began to open up as things snowballed. New standards for fiber-optic transceivers emerged that let DriveNets and others offer the same connections that Cisco traditionally bundled, thereby offering carriers greater choice of physical infrastructure.
As the right balance of hardware and software surfaced, the culture of caution at AT&T and other service providers began to relax.
“It's a big, big change for them,” says Susan, “but once they get it, they see the benefit from cost reduction and the benefit from time to market, and new services to generate new revenue.”
Those benefits include things like having networking that can be dialed up or down the same way Amazon can simply “spin up” more compute or storage.
“A carrier may have a peak during the day for mobile traffic, and a peak during the evening for home broadband.” Traditionally, those shifting patterns of usage were a nightmare for carriers from a capacity planning standpoint.
With the greater flexibility of numerous virtual routers, those different times of the day could be accommodated by simply expanding and contracting a dynamic network. "They can all be one peak on the same infrastructure,” as Susan puts it.
Another headache for carriers, inventory, becomes streamlined by using commodity boxes. “Look, today they have Cisco 6000 and 9000, and Juniper MX, and PTX,” the various models of core routers. “And spares,” says Susan, meaning, the shelves of spare parts a carrier has to keep around for each distinct model of router.
If, on the other hand, a commodity network brick dies, a service provider just swaps it out for an identical unit the same way Amazon and Google swap in and out identical servers.
What can emerge are potentially significant cost savings across a phone company’s network. “You have tens or hundreds of routers in every location, and a carrier may have 4,000 locations in the U.S. alone,” observes Susan, “and they’re constantly adding capacity as the Internet grows.”
The carriers, says Susan, have now gone from cautious to more demanding. “They’re now coming and saying [to Cisco and Juniper and Nokia], ‘I don’t want any more of your chassis with line cards,’” meaning, the traditional cathedral of integrated hardware, “‘I want diversity and multiple options.’”
Cisco has changed its tune, he says. Instead of saying it can’t be done, it is now promising to unbundle and sell the way DriveNets does. Susan’s response to that is to say, “We’re already there.”
“We have hundreds of petabytes running through our solution every day,” he says, “we have multiple Tier One [service providers] in the U.S. and Japan.”
DriveNets has been selling now for four years. Susan declines to disclose revenue other than to say it’s doubling annually and “in the high tens of millions of dollars.”
There are competing software startups, including Kaloom of Montreal, and Pluribus Networks of San Jose, which was acquired by Arista in August for undisclosed terms. DriveNets appears to have received multiples of the financing of either of those two.
“We don’t compete with small startups, we are fighting against incumbents,” insists Susan.
The market for core routing for service providers is twelve billion dollars, annually, according to Susan. If one adds AWS and Google and Azure and other service providers that also buy routers, it’s more like twenty billion. “It’s big enough to build something big,” in his view.
“You have the early adopters that are already working with us,” he says, “and then you have the second wave that are adopting more and more, and we are there right now, and the rest will follow.”
Again, for Susan, there is something of an inevitability in the zeitgeist, if you will.
“Traffic is growing so fast and the revenue is not growing in the same percentage” for carriers, he says. “They need to find a way to monetize the network over the assets that they have, how they can utilize shared infrastructure, and all the power consumption and the rack space.”
At this moment in a startup’s journey, the question naturally arises: Can a challenger, however well funded, go the distance against the incumbent if the incumbent adapts quickly?
DriveNets is a big startup, at five hundred employees, in numerous time zones. Still, “our challenge now is scaling,” says Susan. That pertains especially to the challenge of creating a sales and support regime the way Cisco has for decades. However promising a startup is, that is daunting.
“I asked Diane Greene,” the founder of VMware, “why did you sell the company” to EMC, he recalls. “And she told me, the technology was ready, but there was a big, big challenge to scale it up, to build a sales force, and you need an army to do that.”
Hence, good startups always face the prospect of selling out.
“If somebody came with a big check and it would make sense for my employees and investors, and it would make our technology more successful like VMware did, it would be hard to say no,” he concedes.
For the moment, everything suggests staying the course. “I built the company to be a big company,” he says. “We are putting great talent on the ground and adding more logos every quarter, the pipeline [of business] is very impressive, and we have a lot of money in the bank.”
Long before toppling Cisco, Susan expects to cross paths with another challenger, Arista.
“Arista is a great company,” he says, “I hope I will be as successful as Arista.” It will take a few years, but "we will see them in the cloud,” says Susan. “We will develop a switching capability, and they will develop a routing capability, it’s just a matter of time.”
Though many have tried and failed, the challenge to Cisco from DriveNets seems to me different from past efforts. The rise of cloud computing feels like it could be the end of one era in networking and the beginning of a new one.
Whether it’s DriveNets or another firm that cracks the code, the most compelling question is what happens if there is a massive sea-change in networking.
If the routing market blows open and becomes like all of cloud computing, we could see a world of new kinds of networks and new kinds of network applications. It has happened in computing with the rise of startups such as Databricks and Snowflake. They came about precisely because computing was no longer trapped in a single box. Their arrival is leading to a vibrant new market for database software, as I’ve written. Change in one area can spark change in another area.
Given the breadth and reach of networking as a category of technology, the changes this time around could be quite profound.
Amazon’s sales outlook the biggest forecast miss on record
Oct 27, 2022
Update:
On the call this evening with analysts, Amazon’s CFO, Brian Olsavsky, confirmed that the company’s outlook for this quarter assumes consumers are going to be spending less than normal during the holidays.
“We're very optimistic about the holiday, but we're realistic that there's various factors weighing on people's wallets,” said Olsavsky, “and we're not quite sure how strong holiday spending will be versus last year.”
Olsavsky said the company was seeing sales of consumer goods slowing markedly toward the end of the quarter, especially overseas.
“It was mostly in international we saw the biggest impact,” he said. “And we think that is tied to a tougher recessionary environment there, even if you compare it to the US, it's worse in Europe right now; the Ukraine war and the energy price issues have really compounded in that geography.”
What was a big negative surprise on the call was that Olsavsky said not only are consumer sales slowing, but AWS, the Web services unit, saw its rate of growth slow at quarter’s end. Olsavsky says AWS customers are tightening their belts, and not fully spending the amounts they were committed to spend by contract as quickly as they had expected.
“There are some industries that have lower demand […] things like financial services, the mortgage business being down, cryptocurrencies has been down.”
“And I think everyone is just cautious, and they want to again watch their spend,” said Olsavsky, “and, as CFO, I appreciate that, and we're doing the same thing here at Amazon.”
Yeesh, that’s not good news for the market for enterprise software sales.
Not only is revenue at risk, but Amazon is seeing its own costs rise for energy, for labor, and everything else.
“We have seen inflation in our wages this year and particularly on our tech employees, and it's heavily concentrated in AWS.
“We're also seeing energy costs that are materially higher than they had been pre-pandemic, electricity and the impact of natural gas pricing, so those prices are up more than 2x over the last couple of years.”
Amazon shares regained a little bit of ground following the call, and are now down just thirteen percent.
Previously:
It’s a rough week for MegaCap tech, as I had anticipated, but tonight’s report from Amazon is one for the record books.
The forecast for this quarter’s expected revenue is not just weak, it’s a new record.
Going back sixteen and a half years, as far as FactSet will take me, the gap between Amazon’s forecast for sales and the Street consensus, $144 billion versus $155 billion expected, is the largest on record at nine percent. It’s worse even than the company’s miss in October of 2008, the depths of the Great Recession. That was only seven percent.
Amazon went public in May of 1997, and I’m pretty sure they were forecasting back in the 1990s and early Aughts, but FactSet only goes back to 2006, so we’ll just have to be content with that.
The forecast miss follows third quarter results that were just about in line with expectations.
Amazon shares are down sixteen percent in late trading.
The bulk of the shortfall, both for the reported revenue and for the outlook on sales, is the rising U.S. dollar. As Amazon converts sales overseas to dollars, the stronger dollar reduces what Amazon ends up with from Euros and British pounds and other currencies. That’s a problem everyone faces.
The dollar sucked five billion dollars out of last quarter’s revenue, Amazon said. And for the forecast this quarter, the dollar is expected to suck another 4.6 percentage points of growth from the revenue number.
If you add back those five percentage points of growth, you would get a currency-neutral growth rate of 9.6%. That would lead to revenue of $150.5 billion. That would still be short of consensus by three percent, but much less dramatic than the nine percent miss.
So, you can surmise that Amazon is seeing some of what it expects to be a weaker holiday season than the Street has been modeling.
Amazon’s CEO, Andy Jassy, emphasized in the press release that Amazon is prepared to take many steps to streamline costs to deal with the current economic environment.
“There is obviously a lot happening in the macroeconomic environment,” said Jassy. “And we’ll balance our investments to be more streamlined without compromising our key long-term, strategic bets.”
Meta’s expenses soar in the Metaverse as sales slump in the Actual-verse
Oct 27, 2022
Explaining to investors why The Metaverse is worth tens of billions of dollars in expenses every year is proving a tough sell for Zuckerberg.
Shares of Meta Platforms plunged by twenty percent in Wednesday’s after-hours session as founder and CEO Mark Zuckerberg told analysts how he will increase spending next year at a sharp clip even as the company is starved for growth.
Zuckerberg's comments came as Meta’s third quarter report after the closing bell delivered slightly higher-than-expected revenue, and profit per share merely in line with expectations.
The tension going on in the business could not be more obvious: Revenue is barely growing while Meta’s investment in The Metaverse in its Reality Labs division is burning almost four billion dollars per quarter while producing just a few hundred million in revenue.
The company told the Street to expect next year’s Reality Labs expenses to grow “significantly.”
In 2023, said Zuckerberg, Meta plans to spend as much as fifteen percent more on cost of goods — expensive Metaverse goggles — and on operating expenses to hire Metaverse engineers, spending that may total as much as $101 billion. And that’s without giving any indication of what revenue for the year may be. (The Street consensus is that revenue will rise by eight percent).
Moreover, capital expenses are set to soar as well, likely crimping free cash flow, as the company adds infrastructure to build more and more artificial intelligence throughout its products.
Said outgoing CFO David Wehner, “There is some increased capital intensity that comes with moving more of our infrastructure to AI; it requires more expensive servers and networking equipment, and we are building new data centers specifically equipped to support next-generation AI hardware.”
One analyst on Wednesday’s call, Brent Thill, summed up the frustration on the part of investors at seeing Meta spend more and more with no clear explanation of payoff.
“I think, kind-of, summing up how investors are feeling right now is that there are just too many experimental bets versus proven bets in the core,” offered Thill, “and I'm curious if you can just add more color why you don't feel these are experimental, you feel like they pay off […] everyone would love to hear why you think this pays off.”
What came back from Zuckerberg and team was vague.
“I think a lot of the things that we're working on across the Family of Apps are, we're quite confident that they're going to work and be good,” said Zuckerberg.
“We can't tell you right now how much – how big they're going to scale to be, but I think that each of these things are, kind-of, going in the right direction.”
Not exactly a rousing summation. Meantime, those covering Meta for years are worried the company is not protecting its flank, its ability to sell ads.
Mark Mahaney of Evercore ISI, noting how Apple’s changes to tracking data have hurt Meta, remarked, “This is something that took $10 billion, maybe, out of your business, I mean, it had a material financial impact.
“And listening to the call, I just don't hear it as a major investment priority,” said Mahaney. “The question is, is it a major investment priority, or is it that goal is just elusive and it's better to focus on other things?”
Meta’s chief business officer, Marne Levine, answered Mahaney with a bunch of product descriptions that didn’t really seem to answer his question.
What you’re hearing in Wednesday evening’s back and forth is what happens when vaporware has to confront investors.
When Zuckerberg announced The Metaverse last year, I wrote that it was vaporware, meaning, a tech product that is hyped way before it even exists.
The Metaverse still doesn’t exist, but Zuckerberg has the difficult task now of selling the hype not just to consumers and developers, but also to investors as he pours more and more money into that non-existent world.
Meantime, as I’ve written before, the real investment angle here is the data center companies that stand to rake in real money off that massive increase in capital spending, namely Arista Networks and Pure Storage, two stocks in the TL20 group of stocks to consider.
Arista shares surged by eight percent in late trading Wednesday while Pure shares rose fractionally.
With tonight’s twenty percent decline, Meta stock is down sixty-nine percent for the year.
Nouriel Roubini, hyper-realist of gloom, foresees the worst
Oct 27, 2022
Like a hyper-realist painter, Roubini crafts a vision of dystopia in which every conceivable threat seems to have equal potential.
If you are looking for a big picture with a totally dispiriting tint, Nouriel Roubini is your man.
Wednesday, Roubini, who acquired the moniker Dr. Doom during the 2008-2009 financial crisis, took part in an hour-and-a-half Zoom chat hosted by the Collective[i] Forecast, a speaker series organized by Collective[i], an AI platform designed to optimize B2B sales. The attendees were a small audience of tech types and journalists, myself included.
It was a rollicking hour and a half swept along by the relentless rush of Roubini’s urgent cataloging of all the ills that make our era sound like the worst period, ever, for the planet.
“I’m very ambitious,” Roubini told the audience. “I’m thinking about trying to predict not just the course of the global economy, but of our planet.”
I’ve not yet read the book, but Roubini assured us his writing about the ten plagues is “nuanced.” Which is interesting because his long soliloquies during the Zoom chat, delivered as a kind of verbal onslaught, came across not so much as nuanced but rather sweeping and vivid, like a Hieronymus Bosch triptych of hell.
A better analogy might be the hyper-realist school of painting of the 1960s, such as the works of Richard Estes. The delight in Estes’s paintings, which have always mesmerized me, is that every single detail of a scene, such as a candy shop window, is given equal weight. The sheer agglomeration of detail in the hyper-realist aesthetic is so massive that its totality both stuns and becalms the viewer.
Likewise, Roubini, in fifteen-minute bursts, hammered home his ten themes for the future like Homer chronicling a downfall that had already come to pass.
The big picture is a “regime change” globally, he said. Things that are emerging now are new kinds of threats on a bigger scale than in the past.
“I never, never worried about nuclear war” growing up in the ‘70s, said Roubini, who is in his mid-sixties. Now, he’s been talking with people in Washington, D.C. and, “some people worry that World War III has already, effectively, started.”
Likewise, “I never heard about the term global pandemic,” he said, leaving aside HIV in the ‘80s. Nor had he heard of climate change. “There’s a link between ecological destruction and destruction of animals,” said Roubini. Not to mention, “the release of ancient bacteria” by melting ice caps that could lead to new kinds of global plagues.
“I never heard about AI, machine learning, robots, automation, destroying most jobs.”
“I never heard about debt crisis, at least in advanced economies, because debt ratio to GDP, public or private, was always very low.” Nor had he heard about “implicit” debt from pension systems and social safety nets without support and an aging population.
Now, he says, he foresees multiple economic shocks that will lead to “stagflation” worse than the ‘70s, and an implosion of record levels of debt, both public and private.
“It [debt] used to be a hundred percent of GDP in 1970, two hundred percent in 2000, and today it’s three hundred and fifty percent and rising.” It’s also higher, proportionately, in the U.S. than it was during the Great Depression, he noted.
“There is a time bomb of debt,” said Roubini.
“Zombies, corporates, firms, banks, shadow banks, government, country, household, are going to go bust this time around — the mother of all debt crises.”
The forces are so great, said Roubini, that “we may be at the point where eventually, homo sapiens is going to disappear.”
It is the totality of calamity that overwhelms the listener, like having your fortune told by a tarot-card reader dealing all the worst hands possible: slipping in the shower and breaking your collarbone at the very same moment a burglar is breaking into your house while a gas leak is killing your entire family just as a car runs over your puppy on the front lawn and mere moments before a marauding horde of locusts consumes the neighborhood.
When it came time for audience members to venture a question, many seemed pinned to the floor, awed and desperate to know, What can I do about it all?
Roubini’s counsel seemed just as scary.
“You can prepare individually and you can prepare collectively,” he said. Individuals should look to “have the right skills” and to constantly be in training as some skills get killed off by AI, he said. “Retool yourself, get another degree,” he counseled. “But you have to be thinking: in what way could my job be destroyed by AI, or by an economic and financial crisis.”
On the financial side, said Roubini, look to invest in REITs in parts of the country that will survive climate change — any place but Florida.
“Prepare yourself to live where there is no WiFi” but also no drought or hurricanes, he suggested, which could include parts of Canada.
Collectively, said Roubini, we should look to the lesson of Noah’s ark. “It was a common effort,” observed Roubini. “He didn’t build a massive boat for himself and disappear; he made sure that all of human and animal and plant life was there so that after the flood you can rebuild.”
“I happen to be a Persian Jew born in Turkey who went from Turkey to Iran to Israel to Italy, and then to America,” remarked Roubini. “So, I care about the common good,” not just any one nation in particular.
As artful and passionate as Roubini’s images are, it’s important to step back and remember that reality is not hyper-reality. While Richard Estes's paintings of shop windows have a hundred things that all matter, in the real world, not all of them matter equally.
Of the ten mega-threats Roubini outlined, some of them probably have a negating effect upon others. The hard part about telling the future, as with chronicling the past, is the supple and mysterious way in which, within the totality of things, some things hang together and others blow apart.
By his own account, Roubini is simply a realist.
“I’m Dr. Realist, I’m not Dr. Doom,” Roubini told the audience. “That’s my job, recognizing some of these threats.”
“Cassandra warned about trying to avoid the thing,” as did the great prophets who tried to avert the destruction of the Kingdom of Israel, he observed.
“I want the utopian future,” said Roubini, “but to me right now, we are on the wrong track; I hope the young people who care about nuclear war, climate change, pandemic, economic, financial disaster, and AI and so on, will work with other people to resolve these problems.”
Informatica CEO: More deals being ‘pushed out’ as customers take a cautious stance
Oct 27, 2022
The watchword of the moment for corporate earnings is uncertainty.
“Right now, the problem is, there’s no single place you can go, there’s no consistent set of information for anybody to look at” regarding the economic challenges, says Amit Walia, CEO of data management software maker Informatica.
“So, people are just looking at everything, and there’s an element of cautionary conservatism.”
I was talking with Walia via Zoom regarding Informatica’s third-quarter report put out Wednesday. The report was mixed, with some important metrics turning out better than expected, but also revenue and profit coming in lower than expected.
Uncertainty is what Walia’s customers are dealing with, echoing the vague remarks Tuesday night by Microsoft and Alphabet. The possible tilt toward global recession feels ominous but hard for anyone to get their hands around.
Informatica’s quarterly report, its fifth report since coming public in 2021, was the first time the company missed its own revenue forecast. The company’s forecast for this quarter’s revenue also missed Street consensus.
Informatica shares declined one percent in late trading.
Most of the miss was a consequence of the rising U.S. dollar, which depresses reported revenue. That’s the same thing that hit Microsoft and Alphabet on Tuesday.
But, some of the shortfall is also because customers are taking longer to sign off on software purchases as they grapple with uncertainty.
“We’re seeing more and more of the macro headwinds starting to come into play,” says Walia.
“We’re absolutely seeing deal cycles elongate,” he says. “What would take X amount of time now takes ten or twenty percent longer.”
Informatica is not alone. These delays are a continuation and an increase in the trend for software makers that started to crop up in the previous quarter.
Those delays are not lost sales, Walia tells me; he expects the deals will ultimately be signed. But the delays are going to run into the first quarter of next year.
“I think what we will see is that Q4 will see some of the Q3 deals, but at the same time, Q4 [deals] will move into Q1 or Q2 of next year as well.”
As a consequence of the delays and the continued rise of the U.S. dollar, the forecast for this quarter’s revenue, $398 million to $408 million, is below the average estimate of $424 million.
Does 2023 look like 2008 and 2009, a deep unraveling? I ask Walia.
“Our customers are going to need automation to do more with less” in a time of strained budgets, says Walia. Automation made possible by artificial intelligence is “an area we’re going to push very hard on.”
What’s different this time around from 2009 is that “the shocks are of a very different type,” says Walia, because trouble is not confined to one or two industries such as banking and real estate, as in 2009. “I feel it could be more uniformly challenging” as a result of the broad nature of trouble in the world.
Walia says his customers are as interested as ever in the “digital transformation” projects that the software makes possible, based on numerous visits with customers in recent months.
“The raw desire is there to be doing big data initiatives,” he says. At the same time, “everyone has an element of concern as they walk this quarter and planning for next year.”
“Things have moved so rapidly in the last couple weeks,” says Walia of the global economic situation. “I was in Europe and there’s an element of concern about what happens in winter with gas prices.”
Given all that, “the way to look at 2023 is a very conservative first half, and then a second half that will be relatively okay,” he says, contingent on interest rate drama settling down.
So, what’s the good news? The metrics, those non-GAAP numbers that the Street uses as a proxy for future growth, all came in well ahead of consensus last quarter.
Those metrics include total “ARR,” or, annualized recurring revenue, a measure of the total value of contracts signed when extrapolating out twelve months into the future. Also better than expected were the ARR from subscription products, and from cloud computing versions of Informatica’s software.
“We are a mature company, we have robust scale, we drive profits and cash flow,” Walia points out, noting that the company’s non-GAAP operating profit came in at the high end of the company’s forecast, and better than the Street was expecting. Free cash flow of $77 million was toward the lower end of expectations, but still a nice twenty-two percent increase, year over year.
As important as profit, the use of the company’s product continues to increase, notes Walia. The company’s cloud computing service processed forty-five trillion transactions in the month of September, almost double the rate a year ago.
"Customers are using more of our platform, a lot more, which shows the stickiness it has,” says Walia. “They’re sticky use cases, they’re mission-critical, and that gives me a lot of conviction and comfort.”
He notes, too, the company’s rate of renewal by its customers is in the “mid-nineties” on a percentage basis, which is “very strong.”
“Even if the economy slows down, I would rather have adoption continue to happen because demand will always come back,” observes Walia. “But the proof of your portfolio is customers are using all of it.”
Informatica is “blessed” to have large customers such as Uber and GM Financial, he says.
To focus on just what Walia and his team can control, he says, Informatica is giving significant attention to “the use cases that help customers navigate” their world, such as using data to hold onto existing customers more than acquiring new customers. Fighting churn is the key in such an environment.
As far as Informatica’s own expenses, he’s not cutting anything now, but rather, “Stepping back and looking at our own execution and seeing what we could do differently.”
That includes things such as re-alignment of sales bonuses. There has been a two-tier compensation structure to incentivize Informatica’s reps to sell more cloud services. But now that all the products being sold are cloud, the compensation can be simplified to a single rate, he says.
“Now that we are cloud-only, we don’t need to skew our compensation plan.”
Expanding the company’s partnerships with Microsoft, Snowflake, Databricks and many others will also be part of the tactics and strategy for dealing with uncertainty.
Also important moving into 2023 will be how to invest in product capabilities in the smartest way. The focus of investment, says Walia, is what the company calls CLAIRE, its tools for automating tasks based on machine learning.
Informatica stock is down forty-eight percent this year including Wednesday’s after-hours decline.
Given the cautious trends identified by Informatica, you will probably be hearing similar talk of push-outs from many software makers in the coming weeks. You may want to take a look at my note last week about cash-rich tech stocks that may be something of a less-bad investment if you have to pick software stocks.
Alphabet and Microsoft don’t seem to read the tea leaves very well
Oct 26, 2022
The Street always looks to the largest companies to be seers of the future. It’s a role that some captains of industry have relished in the past, such as former Cisco Systems chairman John Chambers.
Others are not so apt to be prophets.
Tuesday’s earnings reports after-hours brought downbeat results from bellwethers, including Microsoft and Alphabet and chip maker Texas Instruments. All three saw their shares sell off after-hours.
The unsettling part of the reports was the vague way in which the companies spoke about the current economic climate. There’s a broad, shapeless sense that times are tough, things are uncertain, and that there’s absolutely nothing these giant companies can really say to measure the depth of things.
Following Alphabet’s disappointing press release, with revenue missing expectations by two billion dollars, analyst Mark Mahaney of Evercore ISI asked Alphabet CEO Sundar Pichai on the conference call this evening if he could give a sense of how this current economic environment compares to economic cycles in past, given Pichai has been in the business for a while.
Pichai was rather cryptic. “You know, I think compared to the past, I think going through this, I mean there is, as we have said before, there is more uncertainty as we go through,” he said. “We definitely see indicators on both sides, so that makes it a bit more unique.”
Pichai told Mahaney that Alphabet is fortunate to be able to look forward to several years of business growth because of its mastery of artificial intelligence, as if AI would counter-balance a broad economic malaise.
In Alphabet’s lower-than-expected revenue we see that, as I suggested Monday, there was plenty of room to cut expectations still further even after analysts had already cut their estimates all year long. Part of that shortfall was the continued rise of the U.S. dollar, which makes every company that reports revenue in dollars see a hit.
If not for the rising dollar, revenue growth would have been eleven percent instead of six percent for Alphabet. But there were vague allusions to something else. Revenue was “impacted,” Pichai said, by “the challenging macro climate.”
Alphabet’s chief business officer, Philipp Schindler, told analysts that, “There's no question we're operating in an uncertain environment, and that businesses big and small continue to get tested in new and different ways depending on where they are in the world.”
Schindler did, however, try to give some more texture. Advertising, no surprise, is a weak area at the moment.
There was, noted Schindler, a “pullback in spend by some advertisers in YouTube and Network, and these pullbacks in spend increased in the third quarter,” and, in particular, a "pullback in spend by some advertisers in certain areas in Search ads” such as financial services, insurance, loan, mortgage and crypto-currencies, he noted.
Alphabet’s CFO Ruth Porat made a point of reminding analysts that Google was up against “tough comps” in terms of revenue growth, because sales had soared in the same quarter a year ago by forty-one percent. Those tough comparisons continue this quarter.
The bright spot was Google’s cloud computing business, for which revenue was up thirty-eight percent from the prior-year period, at just under seven billion dollars in revenue.
As big as Alphabet is, the company is going to have to rein in some spending, said Pichai.
“Times like this are clarifying,” he said. Alphabet, he said, has “started our work to drive efficiency by realigning resources to invest in our biggest growth opportunities.”
That includes hiring talent more slowly than has been the case, and “making important trade-offs where needed” as well as “focusing on moderating operating expense growth.” Note that Alphabet is still losing a billion and a half dollars per quarter on its “Other Bets,” things like the Waymo driverless car effort.
It certainly brings home the current economic concern to hear gigantic companies such as Alphabet talking about tightening their belts. Analyst Mahaney tried to get Porat to discuss just how the company will do that, but she, too, offered very little in the way of specifics.
“We're trying to be smart about redeploying where we can, find efficiencies where we can while still investing for long-term growth.”
No doubt AI has the answer!
Over at Microsoft, revenue and profit for the quarter were higher than expected, but the revenue outlook for this quarter was well below expectations, $52.85 billion, at the midpoint, versus consensus of $56.1 billion. That roughly three-billion-dollar shortfall amounts to a six percent miss, the biggest forecasting miss in many years.
Most of that difference is the rising U.S. dollar, said CFO Amy Hood on Tuesday’s call. The Street consensus number is expecting 8.6% growth this quarter, but the forecast she gave is more like 2% growth. That still leaves a point or two of the shortfall that is not attributable to the rising dollar.
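As a quick back-of-the-envelope check (a sketch using only the two figures cited above, nothing from Microsoft's filings), the six percent figure follows directly:

```python
# Back-of-the-envelope check of the guidance miss described above.
guidance_midpoint = 52.85e9   # Microsoft's revenue guide for the quarter, midpoint
consensus = 56.1e9            # Street consensus at the time

shortfall = consensus - guidance_midpoint
miss_pct = shortfall / consensus

print(f"Shortfall: ${shortfall / 1e9:.2f}B")   # $3.25B
print(f"Miss:      {miss_pct:.1%}")            # 5.8%, i.e. roughly six percent
```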
Some of that shortfall is the ongoing and well-documented slump in personal computer sales, which Hood said will continue this quarter.
But the Street has its suspicions there’s more at work here.
Analyst Keith Weiss of Morgan Stanley noted that the results in Microsoft’s Azure cloud computing division have been below expectations for two quarters in a row. “I think what investors are worrying about, or, sort-of, wondering about is, is there an inherent volatility in that business that's just harder to forecast?” asked Weiss.
Yes, said Hood, there “is some inherent volatility” to the results in Azure from quarter to quarter. But, Hood also alluded to belt-tightening by Microsoft’s own customers for cloud.
“What we did see through the quarter is a real focus, both by customers but also by our sales and customer success teams, on going proactively to customers and making sure we are helping them optimize their workloads.” Uh-oh. Optimization here sounds like people are feeling a bit pinched. Hood told Weiss this is even more the case among small and medium businesses.
It will be interesting to see how that plays out for DigitalOcean, a company banking heavily on cloud services for small and medium businesses. DigitalOcean shares were down six percent in late trading this evening, perhaps in sympathy.
When analyst Mark Moerdler of Bernstein pressed Hood on the matter of Azure growth — why is growth slowing? he asked — Hood replied that some of it was this optimization stuff, and some of it is because “there is per-user headwinds as well because we're getting and seeing some of these laws of large numbers in terms of the per-seat business.” That sounds like a clever way of not really saying anything much at all.
CEO Satya Nadella was upbeat about the cloud. He remarked that even though customers are “optimizing,” meaning, spending less, still cloud computing is a big winner in uncertain economic times because it’s a way for companies to offload some of their expense onto the service provider.
Over at Texas Instruments, where revenue and profit both beat expectations for the September quarter even as the forecast for both missed consensus, the job of forecasting is a little easier because TI doesn’t really say much of anything. Head of investor relations David Pahl and CFO Rafael Lizardi confined themselves to speaking only about the particular market trends where they sell chips.
The market for personal electronics continues to be weak, which is not surprising given the breakdown in smartphone sales and PCs this year. And they see some weakness spreading throughout the industrial equipment market for chips. The automotive market, however, is doing very well and should continue to hold up.
Part of the problem for TI as a bellwether is that its business in those chip markets is so broad, it is as if the company can’t see the trees for the forest.
“Our business model is such where we target the vast majority of our parts sell to many, many customers,” said Lizardi. “So, they're very broad in nature. The product life cycles of the parts is decades in many times.”
So much for pontificating. The most important things to know about TI may be beyond this current economic cycle. Those important things include the fact that the company has spent many years diversifying away from personal electronics, so it doesn’t suffer as much as some other chip makers from the decline in phones and PCs.
And the other important thing is that TI has its own factories, lots of them, in the U.S. The company is especially excited, said Pahl, about the CHIPS Act and what it will do to bring TI vast sums of money to expand manufacturing here in the U.S.
TI shares are down about fourteen percent this year with tonight’s after-hours decline. Microsoft is down thirty percent, and Alphabet stock is down ten percent.
Molehills out of Mountains: The week ahead for AAPL, GOOGL, AMZN, MSFT
Oct 24, 2022
This week will be MegaWeek for earnings season, as I tend to think of it, the week the biggest firms in tech report, with Alphabet and Microsoft reporting on Tuesday after the closing bell, and Apple and Amazon on Thursday. The full lineup is in the table at the bottom of this post.
Monthly estimate change for revenue in the most recent quarter. Source: FactSet.
The accompanying charts show how estimates for the most recent quarter, the September-ending quarter, have changed month by month for all four companies. One chart is for the revenue number for the quarter, the second chart is for the earnings per share, or EPS, estimate.
As you can see, prospects for earnings and revenue for all four have been cut since the beginning of the year. Revenue estimates have been cut by five percent, on average, while EPS estimates have come down by twenty-three percent.
Revenue has held up relatively better than EPS, in other words, but this week could very well see the Street take an axe to those revenue estimates once again.
Part of the Street’s job between quarters is to make wild guesses without much help from the companies. Alphabet never offers a forecast. Apple hasn’t forecast anything in over two years. Amazon forecasts sales but not profit. Microsoft is the only one that forecasts both revenue and profit.
All of these businesses are susceptible to further erosion beyond the cuts so far, given that they are tied to consumer spending in no small way. Apple has so far proven the most resilient: it is the only one of the four that has beaten expectations this year for both revenue and EPS.
Monthly estimate change for EPS in the most recent quarter. Source: FactSet.
In fact, Apple is the only one that has seen a slight increase in estimates for revenue and profit in the past month. That is despite a lot of hand-wringing about whether people are buying iPhones or not. While Apple no longer discloses iPhone shipments, the company’s remarks about inventory will be a clue for analysts to parse to make up their own numbers.
Alphabet is facing an advertising market that is apparently weakening, based on Snap’s results this past week, even though midterm U.S. elections should be a boon to advertising.
Amazon is facing not only the prospect of inflation limiting purchases, and recession crimping spending, but also the continued elevated cost of logistics, which could eat into profits.
And Microsoft may or may not have more bad news to offer about the plunge in personal computer sales. More important will be what its cloud computing results say about how data center spending is holding up.
Apple, Alphabet, Amazon and Microsoft shares are down 17%, 30%, 28%, and 27% this year, respectively.
In Barron’s Advisor: Picking cash-rich software stocks
Oct 21, 2022
The bulk of earnings season gets underway next week, when giants including Microsoft and ServiceNow will report. In anticipation of the action, my latest missive for Barron’s Advisor took a look at which software makers have ample free cash flow to go the distance. (Subscription required to read Barron’s Advisor articles.)
Last quarter, as I chronicled on a weekly basis, software makers warned of delays in deal signings, what they term “push-outs,” where more scrutiny is brought to bear on software sales by customers.
My premise for the Barron’s Advisor article is that we will see an increase in this trend during the current reporting season. October is typically a time when companies evaluate budget priorities for the coming year. I expect that such activity by software customers may start to show up in the remarks that software vendors offer about their own outlook, and perhaps even their formal forecasts.
If so, nervous software investors may look for reassurance in the profit profile of software companies. This year has seen an end to “growth-at-any-cost,” and greater scrutiny of the P&L and the cash flow statement. Investors suddenly want to know that software makers can be profitable.
In that vein, my article looks at almost a hundred and fifty U.S.-listed software companies to see what their state of cash flow is. While I boiled down the list to ten names for Barron’s Advisor, below I provide a rather more lengthy list of forty-seven stocks that are already generating meaningful cash flow and have double-digit projected revenue growth.
Source: FactSet
The table’s central feature is the column showing expected free cash flow projected over the next twelve months on a dollar basis, with Microsoft being absolutely in a class by itself with an expected sixty-nine billion dollars in free cash flow.
You can make your own decisions about how to consider sorting these names: by their free cash flow, by their free cash flow “yield,” which is free cash flow divided by the company’s total market cap, or by their rate of revenue growth. Averages are shown at the bottom of the table in case you want to pick names that are above average by any measure.
One rather unusual measure is what I call “The Don Valentine Ratio.” The late, great Don Valentine, who pioneered venture investing in Silicon Valley, said you only need to know two things about a business, its gross profit margin, and its free cash flow. Hence, the Don Valentine Ratio is free cash flow divided by gross profit. It basically means, how much of a company’s gross profit does it manage to turn into a cash profit after spending on the essentials. I think it’s a very interesting ratio to follow. On average, the software makers convert a little over a quarter of their gross profit, 0.27, into free cash flow.
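To make the two screening ratios concrete, here is a minimal sketch in Python; the figures are made-up placeholders for a hypothetical software maker, not the FactSet data in the table:

```python
# Sketch of the two screening ratios described above, using made-up
# placeholder figures rather than actual FactSet data.

def fcf_yield(free_cash_flow: float, market_cap: float) -> float:
    """Free cash flow divided by total market cap."""
    return free_cash_flow / market_cap

def don_valentine_ratio(free_cash_flow: float, gross_profit: float) -> float:
    """Share of gross profit a company converts into free cash flow."""
    return free_cash_flow / gross_profit

# Hypothetical software maker: $500M FCF, $10B market cap, $1.6B gross profit.
fcf, mcap, gp = 500e6, 10e9, 1.6e9

print(f"FCF yield:           {fcf_yield(fcf, mcap):.1%}")           # 5.0%
print(f"Don Valentine Ratio: {don_valentine_ratio(fcf, gp):.2f}")   # 0.31
```

The hypothetical company above would screen slightly better than the group average of 0.27 on the Don Valentine Ratio.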
I’ve also included one valuation measure, the company’s enterprise value as a multiple of the next twelve months’ projected sales.
I’ve also offered below a rundown of coming earnings this month. Names among the forty-seven software stocks are highlighted in green.
ASML: We have no clue where all these chips are going
Oct 20, 2022
Shares of chip equipment maker ASML Holding surged Wednesday by over six percent following better than expected third-quarter results, and a better-than-expected outlook.
The immediate reaction of anyone watching the chip world these days might be astonishment. There has been a steady stream of negative announcements from chip makers within a short span of time that suggest the chip market is in free-fall.
Consider that Taiwan Semi’s CEO C.C. Wei last week said chip companies continue to “adjust their inventory,” echoing Advanced Micro Devices’s CEO Lisa Su, who had said the week prior that the PC market has weakened “significantly” in the past ninety days. And memory-chip maker Micron Technology two weeks ago said it will cut its capital investment by forty percent and that the collapse in chip demand is “unprecedented.”
All these companies are customers of ASML, either directly or indirectly, so how is it ASML is doing just fine?
There’s a short answer and a long answer.
The short answer is that while demand is breaking down in certain markets such as PCs and smartphones, there is no broad, general oversupply of chips. As I suggested two weeks ago, a market with tight supply overall is a healthier market than one with a glut from over-building.
ASML sells tools to make chips years into the future, and on Wednesday, CEO Peter Wennink, during the company’s conference call with analysts, said ASML still faces a shortage of materials to make its tools. As has been the case all year long, his company can’t build its equipment fast enough to meet demand.
“There’s still such a big gap between the demand side and what we can make,” said Wennink.
“Looking to next year,” said Wennink, “With demand expected to remain significantly above supply, and based on discussions with our customers, we're planning to increase our system output next year.”
The company has its highest-ever backlog, thirty-eight billion dollars worth, said Wennink. Remember that ASML is the sole supplier of its kind of chip-making equipment, lithography tools. It is also a boutique supplier. It supplies tens of machines per year of each kind of machine, not millions. These machines are like small-batch whisky: you cannot crank up supply quickly enough.
Asked about Micron’s cut to its spending plans, Wennink said it doesn’t matter: everyone needs capacity beyond the moment, for years down the road. ASML is selling equipment to be put into place now for chips that won’t appear for another two years or more.
“Some of our customers … indeed look at their CapEx guidance for next year and have taken it down, but those same customers, in the same breath, tell us, Listen, we need those machines,” said Wennink.
“And these are the machines that we need for 2023, because they're strategic, they’re long-term in nature, and if you ship me 2023, it will only be 2024 output.”
Wennink was asked about statements by Micron and others that they are running their factories at a lower rate. But, said Wennink, that is a decline from a huge surge in production the past two years, when the world couldn’t get chips fast enough.
“If you look at the decrease of utilization … it comes off a peak that we've never seen before,” said Wennink. “So, it's not that you see a steep decline, you see it leveling off.”
So, the short answer is that supply of the most sophisticated equipment to make the most sophisticated chips is still tight. Supply drives things; it determines whether the market is healthy or bloated. Right now, it’s running lean, and there’s no sign of that changing.
There is, however, a long answer, and it’s somewhat mysterious. Analyst Joe Quatrochi of Wells Fargo pressed Wennink on the issue of chip demand. Could there be some kind of “air pocket” in 2024, he wondered, meaning another collapse in demand?
“That's a reasonable question,” said Wennink, “but the answer is, we have no clue because nobody knows 2024, I mean, we are struggling to understand 2023.”
Wennink was further pressed on the matter by analyst Amit Harchandani of Citigroup. Who is putting in all these orders for equipment, and why? asked Harchandani.
“There's not one firm on the planet that actually has the full insight into where all these chips are going, and where they are being designed into,” said Wennink.
“It's consumer, it's industrial, it's automotive, it's energy transition — just the sheer application space has grown so much,” said Wennink.
Wennink said his customers don’t know, either. “The CEO of one of our largest customers, when I asked the question, because you have a very significant market share, you should know, said, ‘Yes, I have no clue.’”
“This is exactly because nobody connected the dots,” said Wennink. “And if you ask me exactly where it goes, I have to also say, we don't have that full clarity.”
So, the long answer as to why ASML is seeing more business than it can handle is that demand has a mysterious, expanding quality, years into the future. The use of semiconductors is expanding so fast, even industry insiders don’t know what is going on.
Netflix: Not fantastic, but reasonable
Oct 19, 2022
“Well, thank God” the company’s subscriber count is no longer decreasing, said CEO Hastings. The company will no longer forecast quarterly subscriber additions, removing one of the Street’s favorite footballs for quarterbacking the company’s progress.
Shares of Netflix in Tuesday evening’s after-hours trade were up fourteen percent, perhaps a short squeeze as the company delivered better-than-expected September-quarter results and a slightly better outlook, after what has been a disastrous several quarters of little to no growth.
Netflix’s “net” additions, meaning how many paid subscribers it added after accounting for churn, was 2.4 million, better than the company’s own forecast of just a million.
“Well, thank God, we’re done with shrinking quarters,” said founder and CEO Reed Hastings during the company’s conference call, which, as in the past, was hosted on YouTube as a kind of TV show, with only one analyst participating, JP Morgan’s Doug Anmuth.
The outlook for this quarter’s paid subscribers, 4.5 million, is slightly higher than the Street’s average 4.3 million estimate. Said Hastings, “The results this quarter, and the guidance for Q4, are reasonable — not fantastic, but reasonable.”
The 4.5 million number is a huge comedown from past Netflix history. The average net subscriber gain in Q4, over the preceding five years, was 8.2 million. Hence, Netflix has got a lot of ground to recoup to get back to growth.
Interestingly enough, the company is now divorcing itself from the subscriber number. Going forward, Netflix said in its shareholder letter, the company will no longer forecast subscribers.
“Starting with our Q4’22 letter in January of 2023, we’ll continue to provide guidance for revenue, operating income, operating margin, net income, EPS and fully diluted shares outstanding for the following quarter, but not paid membership.”
This is akin to how some other companies have stopped talking about certain numbers. Apple, for example, some years back stopped disclosing iPhone unit sales.
Hastings and team are now increasingly focused on revenue growth as their main measure of success. That growth is currently anemic, at six percent, year over year, last quarter, versus sixteen percent a year earlier.
Some of that anemic growth is the hit from the rising U.S. dollar. Growth would have been thirteen percent otherwise.
The expectation of Hastings and team is that growth will pick up as a result of the company’s planned roll-out of its ad-supported programming next month. For $6.99 in the U.S., the subscriber will get all the same Netflix programming but have to endure five minutes of ads per hour. Hastings and his COO, Gregory Peters, noted that the company is being very strict with “caps” on the amount of ads served, to preserve the quality of the user experience.
COO Peters told Anmuth that, even if some existing users switch to the lower-priced ad-supported plan, nevertheless, “we expect this leads to a significant and incremental revenue and profit stream.”
The other big change that’s coming is “paid sharing,” where people who’ve been mooching off of someone else’s account are given a kind of amnesty to come clean by creating their own profile of preferences. The hope is that maybe some cheapskates will be induced in that fashion to at least pay up for the ad-supported version.
Credit where due: Netflix has thought through its challenges, and the introduction of an ad-backed offering and the paid-sharing idea are commendable attempts to deal with them.
Despite tonight’s pop in the stock price, the question will remain as to how Netflix’s success is to be measured for the seventeen billion dollars it spends annually for content.
Co-CEO Ted Sarandos offered a vigorous defense of the company’s content spending.
“We started this ten years ago, we had no IP, we had no library, we moved as quickly as we could to build a library of our own IP and to build our own library,” said Sarandos. “And in those ten years, that library now gets more viewing, more revenue and more profit than all of our competitors who have been at it for over one hundred [years].”
Hard to argue with that. Except that in those ten years, the Street always measured success by subscriber additions. Either growth will return with the addition of ads, or else investors will start to measure the company by some other metric.
That something else might be free cash flow. The seventeen billion in cash expense for content is expected to leave about a billion dollars in free cash flow this year, a three percent free cash flow margin, and about two billion dollars next year, which would be about six percent.
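The margin math here is simple division of free cash flow by revenue. A quick sketch, with the caveat that the revenue figures below are my own round-number assumptions for illustration, not reported company figures:

```python
# Free cash flow margin: free cash flow as a percentage of revenue.
# Dollar amounts are in billions; revenue figures are assumptions.
def fcf_margin(free_cash_flow_billions, revenue_billions):
    return 100.0 * free_cash_flow_billions / revenue_billions

# ~$1B of free cash flow on roughly $31.6B of revenue is about
# three percent...
this_year = fcf_margin(1.0, 31.6)   # ~3.2
# ...and ~$2B on an assumed ~$33B next year is about six percent.
next_year = fcf_margin(2.0, 33.0)   # ~6.1
```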
That’s not a huge payoff for all the investment, but if the horizon continues to look more cash-rich, then perhaps the right-sizing of Netflix will be what investors cotton to.
The most important thing, as far as Hastings is concerned, remains the same: “Linear television is going off a cliff,” as he put it, citing the remarks of Disney’s Bob Iger. It’s going off a cliff in terms of hours viewed — streaming content is now more time spent by viewers than broadcast and cable, combined, he noted — and linear is also “collapsing as an advertising vehicle,” said Hastings.
With “connected TVs,” those sets that have Internet hook-ups, pushing more and more people to watch streaming, said Hastings, “think of it as pretty steady every year, climbing share,” meaning, streaming taking share from linear TV.
The fight going forward, said Hastings, is to “have the best content” and “the lowest prices.”
“We’re pretty excited about this next phase,” he said, “which is competitive excellence … if we can just be better than everybody else.”
With tonight’s fourteen percent pop to $275.50, Netflix shares are down fifty-four percent this year.
Couchbase CEO: There is lots of room for database competition
Oct 19, 2022
The database is a computer technology that once seemed pretty settled terrain given the success of Oracle, IBM, Microsoft, and a few others. But it is such an important technology that there have been many attempts to redefine what the database is for an era of cloud computing.
And, so, the past decade has seen new products such as Amazon’s “RedShift” database, and new companies, including Snowflake and MongoDB, and Databricks, each with bold and intriguing new approaches.
Some would say that despite decades of development, the battle for the future of the database is just beginning.
“I do believe there will be an emergence of more database companies,” says Matt Cain, the CEO of one of those companies, Couchbase. “It’s a sixty-billion to one hundred-billion-dollar total addressable market, and the demand is significant.”
Fourteen-year-old Couchbase of Santa Clara, California, is the latest promising database startup to come public, in July of last year, just before the IPO window shut.
The potential hundred-billion-dollar market Cain describes, what he calls a “generational market opportunity,” is the proliferation of apps by all companies as they “digitally transform,” meaning, reach more and more of their customer base via software.
“There is an insatiable demand for applications,” Cain tells me, in an interview we had recently via Zoom.
“There is not a single company that is not thinking about how to use technology to get closer to their customers,” he says. “But behind every application is a database.”
And that database for modern applications, he says, needs to be of a certain kind.
“Why have we emerged?” he asks, rhetorically of Couchbase. “We are fulfilling a need that just can’t be fulfilled by MongoDB, or the hyper-scalers, or, certainly, legacy providers” such as Oracle.
As is often the case these days, that technology edge has its roots in open-source software.
Around 2008, a startup called CouchOne was formed by Damien Katz, who had created CouchDB, an open-source database. CouchDB had a special ability to represent information as documents, which allowed it to work with all kinds of data, not just the “relational” columns and rows of traditional databases.
A second company, MemBase, was formed around the same time to exploit another open-source project, Memcached, a technology that allows multiple server computers to share their DRAM memory circuits, thus speeding up processing by “caching” data.
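The caching idea at the heart of Memcached can be sketched in a few lines. This is a toy, single-process version of the concept; the real Memcached is a networked, distributed cache spread across many servers:

```python
# Keep hot results in fast memory so repeated lookups skip the
# slow backing store. A stand-in dict plays the role of a database.
slow_store = {"user:1": "Ada"}
cache = {}

def get(key):
    if key in cache:            # cache hit: served from memory
        return cache[key]
    value = slow_store[key]     # cache miss: fetch from the slow store
    cache[key] = value          # ...and remember it for next time
    return value

get("user:1")                   # first call misses, populates the cache
get("user:1")                   # second call hits
```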
The merger of the two startups in 2011 formed Couchbase, and it created a product with attributes particularly well suited to the cloud computing era. They included the ability to make a database “scale” across all kinds of computers, from servers in data centers down to smartphones, and to do so while accommodating all kinds of data, including images and video, things not handled by relational databases.
Couchbase, along with a whole field of challengers to the status quo, came to be known as “NoSQL,” or “not only SQL.” It was a statement of defiance, and also a dig at the S-Q-L query language of the status quo.
The late co-CEO of Oracle, Mark Hurd, dismissed NoSQL as much ado about nothing when I asked him about it in 2015. What Hurd called a fad turned out to have real momentum, despite some false starts.
A NoSQL startup named MarkLogic was taken private in 2020 by private equity firm Vector Capital Management. But another NoSQL name, MongoDB, came public in 2017, and, up until this year’s collapse, its stock had a great run. Established companies such as Amazon and even Oracle ultimately climbed aboard, offering their own NoSQL databases.
While Katz and many of the other co-founders of Couchbase moved on, the company in 2017 gained what you’d call adult supervision with the arrival of Cain. He had large company experience, including running worldwide field operations for data backup company Veritas, later sold to private equity.
“I was fortunate enough to know one of the early investors,” recalls Cain. “He called me and said, ‘Let me talk to you about the best company you’ve never heard of.’” As they discussed it, Cain said he realized a substantial “disruptive” market opportunity was on the horizon.
Cain is acutely aware that Couchbase has competition from MongoDB but also the dominant companies such as Amazon and Oracle that now offer their own NoSQL product.
MongoDB’s success is well-earned, he says. “MongoDB is a document-based database, and they have proven how useful that can be,” he says. “What makes Couchbase unique” from MongoDB and the other NoSQL offerings, he says, is the union of the former CouchOne and Membase in 2011, the combination of document database and caching.
“If you put those two things together, that becomes the foundation of a platform that will serve the highest-performing applications over time” by offering not just speed, he says, but also “a very flexible data schema to serve the types of applications we’re building for.”
A “schema,” in this case, is the way that the database organizes information. The traditional relational database sold by Oracle and IBM and others is based on a columnar format, rows and columns like a spreadsheet. The NoSQL document database is more fluid, it can organize information with any number of attributes.
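A toy illustration of that flexibility, with field names invented for the example: in a document database, records in the same collection need not share the same fields, whereas a relational table fixes its columns up front.

```python
# Two records in a document-style collection. Each carries its own
# set of fields; there is no fixed column list to alter.
user_a = {"id": 1, "name": "Ada", "email": "ada@example.com"}
user_b = {"id": 2, "name": "Bo", "devices": ["phone", "watch"],
          "prefs": {"theme": "dark"}}
collection = [user_a, user_b]

# A relational table would force every row into the same columns;
# here, a query simply skips fields a given document doesn't have.
emails = [doc.get("email") for doc in collection]
```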
“They’re trying to hang on,” he says of Oracle and other database giants, but “relational [databases] don’t have the schema to support modern apps,” says Cain. “If you talk to any one of our customers, they would say, you can’t extend Oracle to provide the same value” as Couchbase.
While cloud giants such as Amazon have their own NoSQL databases, says Cain, “We have to differentiate on scale and performance, but also running in every cloud, and cloud to edge,” because enterprises want to use multiple service providers.
The most intriguing aspect that may distinguish Couchbase is one I was not aware of, which is that the company has fashioned versions of Couchbase to run in an embedded fashion. The “lite” version of the database can be installed on “Internet of Things” devices or mobile phones, places that would normally be too resource-poor to run a database of any meaningful ability.
“What Couchbase did was, we said, If we think about the future of applications, they are not only going to be running in data centers, on premise or in the cloud, but all the way out to the edge,” says Cain.
The edge, in this case, is the panoply of devices outside a data center that need their own database.
“I would be willing to bet you a cold beer,” he says, “that you have five to ten Couchbase instances in the phone in front of you, because Couchbase is running co-resident inside the application in the device.”
“Think about how many times you’re touching things on a mobile device, whether it’s streaming content or interacting with your banking platform, or engaging with your healthcare provider or playing an online game,” says Cain.
“All these things are happening in your mobile device, and it’s a fundamentally different challenge to support that from a database, and we have architected that from the beginning.”
All of Couchbase’s capabilities, says Cain, are in service of “trying to solve the hardest and most sophisticated computer science problems in the industry, in supporting some of the world’s largest, most mission-critical, high-performing applications.”
The mission-critical apps might surprise you.
A signature use of Couchbase is aboard the ships of Princess Cruise Lines, part of Carnival Corp. The company uses Couchbase inside a pendant the size of a quarter, called a “Medallion,” that can be worn as a bracelet or necklace.
Guests on the ship who wear the pendant travel in what Carnival calls “MedallionClass.” Using wireless technology, the pendant lets a guest board the ship contactlessly, enter their cabin without a key, instantly locate friends and family onboard, and get anything and everything delivered to them wherever they are with a tap on the companion smartphone app.
The medallion becomes a personal tracker with all sorts of possibilities.
“You and I could sit down at the bar for the first time, have a beer, they’ve now created a social hypothesis that you and I have befriended each other,” explains Cain. “Unbeknownst to us, two days later, there’s another sporting event, they send an ad to each of us saying, fifty percent off on beers in the next hour and a half, we both show up, and it’s, ‘Hey, Tiernan, how’s your cruise going?’ — all of that runs on a Couchbase platform in a wearable.”
The medallion, moreover, becomes a way for the cruise line to plan inventory and services. All those medallions send data to a Couchbase instance running in the cloud that can be sorted and sifted for analysis.
“This is the example of completely transforming digitally,” says Cain. “There is no platform on the planet that was architected for that use case.”
And the value of those things becomes greater than just the immediate revenue that Couchbase earns.
“Where the revenue resides based on our pricing model doesn’t fully account for how strategic the technology is,” says Cain.
I point out to Cain that for me, personally, a pendant that tracks my every move is a bit spooky. “It’s cool,” is his rebuttal, and even “mind-bending.” You can always opt out, he points out. It will come down to the application developers finding what works for their customers. But the implications for the database are significant.
“That is the definition of digital transformation, that is what Couchbase has been built for,” he says.
So far, the strategic applications are fueling decent growth but not profitability. Couchbase told analysts last month that it expects to make around $150 million in revenue in the year ending in January, a twenty-two percent year-over-year gain.
The company’s track record is good, having notched five quarters since the IPO in which revenue beat expectations. But the company is also expected by the Street to lose $1.17 per share this year, excluding some costs. Both earnings per share and free cash flow are expected to remain negative through 2025.
In the current climate of greater fiscal probity among investors, how do his shareholders view that loss-making outlook? I ask.
“We take our fiduciary responsibility with the utmost sensitivity,” says Cain. “What is most important to investors is that they understand that we have a strategy to build a great company over time, and that we also have the operational handles and levers to flex as business conditions and macro economic environments dictate that we need to, in service of building a great company over the long term.”
Having operated through multiple economic cycles, says Cain, the company has proven it can rein in spending or expand it as need be.
But, he says, the company is “still in investment mode.” Couchbase’s cloud version of its software, called “Capella,” introduced a year ago, is going to “take us to new heights,” he says. The service, which runs in public clouds including Amazon AWS as a rentable service, is a focus of Couchbase’s spending on product development.
“There are a lot of investors who would say, Keep investing, there’s so much upside,” he observes.
Profitability is “an eventuality,” he says, “the question is what is the pace.” The company will probably host an “analyst day” meeting in the first half of 2023, he tells me, the implication being that there will be some goal-setting for profit at that time.
I point out to Cain that at a recent price around $12, the stock is fairly cheap, trading at a multiple of next year’s sales of less than three times.
Is the stock a good buy? I ask Cain.
“Look, I tell everybody with as much conviction as I can that the best days of Couchbase are in front of us,” says Cain.
“I jumped out of bed this morning knowing that today is another day I get to work with the other Couchbasers around the world,” he says. “We are very bullish about what we’re doing, and, unquestionably, we think there is tremendous upside at Couchbase.”
Couchbase stock is down fifty-one percent this year, and fifty-six percent since IPO.
Bitcoin chips, puppy Web sites among Q3’s terrible IPO returns
Oct 15, 2022
Even though the initial public offering window has slammed shut this year, that doesn’t mean that no tech companies came public. In fact, if you paid attention in the third quarter, July through September, you got to see eleven rather curious tech companies coming public, representing everything from puppy-enthusiast Web sites to touch-less gesture controls to crypto-currency mining.
The singular achievement of this cohort of eleven offerings is that it represents a huge jump from only five offerings in the second quarter and eight offerings in the first quarter.
It was, however, a pretty awful quarter to go public, with absolutely terrible price declines.
It started out just fine. The average first-day pop, meaning the rise in price from the offer price to the closing price on the first day of trading, was an average gain of sixty-three percent, which is wonderful for the selling shareholders who cashed out.
But with the broad market declines in late August and September, the average decline in price from the offer price till today, and from the first-day close till today, was fifty-seven percent and fifty-five percent, respectively. Ouch!
They are all below their offer price at this point, and only one company has notched a gain from the first-day price, Laser Photonics, mostly because it had a big nineteen percent jump on Friday.
And valuations have cooled off, the average price-to-sales multiple for these stocks being just twenty-one, down from twenty-eight last year.
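Both the return figures and the valuation multiples above come from simple arithmetic. A sketch, with the caveat that the example prices, share count, and revenue below are invented round numbers, not data on any specific offering:

```python
def pct_change(start, end):
    # Percentage gain or loss relative to a reference price.
    return 100.0 * (end - start) / start

def price_to_sales(price, shares_millions, revenue_millions):
    # Market capitalization divided by expected revenue.
    return (price * shares_millions) / revenue_millions

# A hypothetical offering priced at $10 that closed its first day
# at $16.30 enjoyed a sixty-three percent pop...
pop = pct_change(10.0, 16.3)
# ...and at $4.30 today sits fifty-seven percent below the offer.
slump = pct_change(10.0, 4.3)
# A $2 stock with 100 million shares and $10 million in expected
# sales would trade at twenty times sales.
multiple = price_to_sales(2.0, 100.0, 10.0)
```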
Three of the eleven companies are “pre-revenue,” as they say. But I wouldn’t call these companies the dregs of IPO. Perhaps they are the brave companies willing to stick their necks out. They certainly raised money, a grand total of $138 million, and an average of $13 million.
They are an odd lot, though, I must say.
You can divide the eleven into “hard” tech, things that might take an engineering degree to get off the ground, like chips and sensors, and “soft” tech, things that are more about reinventing a fairly straightforward type of business that is not necessarily very tech-heavy.
In the first category, hard tech, an example is Nano Labs, based in the city of Hangzhou in the Zhejiang province in China. The company, founded in 2019, makes designs for chips to crunch numerous crypto-currencies, including Bitcoin and Ether.
That’s interesting because there is already pretty fierce competition in the market for crypto mining chips, including the two dominant vendors BitMain and MicroBT. Intel is also in the market with some initial chips, and, of course, Nvidia is a huge competitor even though they don’t have the market share of BitMain and MicroBT.
Nano Labs doesn’t mention those competitors by name, merely states that it faces “intense” competition.
Another hard tech firm is Mobilicom of Tel Aviv, which has been around for sixteen years. The company notes that it has patented hardware and software technology to embed a variety of functions into drones, or “small, unmanned aerial vehicles,” SUAV, as they’re termed.
The two main pieces of art the company claims are a modem technology to efficiently transmit data wirelessly, referred to as “joint beam formation and synchronization,” which is covered in a 2013 patent; and an algorithm to form “mesh” networks spontaneously, or “ad-hoc,” covered by a 2020 patent.
Unlike some other companies, the firm has revenue, a total of two and a half million dollars last year. It’s also losing money, almost two million dollars last year. The eleven million dollars in proceeds from the August 25th offering will help in that regard.
There was even a consumer tech company that came public, eight-year-old, Tel Aviv-based Wearable Devices, which won an award last year at the CES trade show in Las Vegas for its Mudra Band, a watch strap for your Apple Watch that lets you control the watch without touching it, just by making gestures with your fingers.
The Mudra technology — “Mudra” is a Sanskrit word for “gesture” — involves something called “Surface Nerve Conductance,” or SNC, which lets the band “track neural signals on the surface of the user’s wrist, which our algorithms decipher to predict as finger movements or hand gestures.”
The sensors “can detect multiple types of gesture,” says the company, including hand movements, finger movements, and fingertip pressure gradations. There’s also the prospect of using the bands for digital health, the company says.
Wearable Devices will also offer a “kit” for developers in corporations to build on the technology. However, the company hasn’t mass-produced anything yet. Several million dollars of the fifteen million raised will be used for mass production of the $179 Mudra Band and for marketing and sales.
On the softer side of tech, there’s two-year-old Onfolio Holdings of Wilmington, Delaware, which received thirteen million in its August IPO to buy up odd Web properties that are “under-monetized.”
The company spells out its business plan in the prospectus: “We believe there are opportunities to acquire ‘distressed’ eCommerce and content websites, or where the sellers have not optimized the website to the fullest,” and then determine the “leverage points and growth opportunities that the current website owners have not fully utilized.”
It seems a daunting task given these are somewhat oddball sites. The eighteen sites owned include allthingsdogs.com, Vital-Reaction.com, a site for sales of molecular hydrogen tablets, and onthegas.org, a site with recipes for things like “beer can chicken” and reviews of cooking equipment. And I thought Technology Letter was a niche.
Another soft-tech venture is sixteen-year-old GigaCloud, based in Hong Kong, whose business is to streamline business-to-business commerce pertaining to large “parcel” items, such as furniture and home appliances. The company acts as a kind of hub to bring together third-party sellers of such goods with resellers who would like to sell the merchandise on their own Web sites.
The company’s marketplace has been live since January of 2019. GigaCloud describes the offering as “a true comprehensive solution that transports products from the manufacturer’s warehouse to end customers, all at one fixed price.”
A big part of the plan is providing the fulfillment capabilities merchants need. The company notes that it operates “warehouses in four countries across North America, Europe and Asia,” including twenty-one “large-scale warehouses around the world totaling over four million square feet of storage space, cover 11 ports of destination with over ten thousand annual containers,” as well as “an extensive shipping and trucking network via partnerships with major shipping, trucking and freight service providers.”
Oh, and the company uses special AI software, it says, to model seller ratings and credit profiles. You have to have AI in your business plan these days.
GigaCloud has some real revenue here, totaling $414 million last year, and net income of $29 million, not bad. That’s with 382 active sellers last year, and 3,566 active buyers spending an average of almost a hundred and twenty thousand a year.
Are there any winners here? None that immediately stand out to me, but I wouldn’t count them out, either. I tend to lean toward the hard-tech companies, just because the value of patents seems to me a clear point of differentiation for those companies.
I will tell you this: at an average stock price of $2.19 for all eleven, you could certainly roll the dice on any of them without it costing you much.
In case you were wondering, this brave cohort has not thus far reignited the IPO market. In the first two weeks of October, no tech companies have gone public.
Taiwan Semi says demand for cutting-edge chips remains, plays down China risk
Oct 13, 2022
Taiwan Semiconductor Manufacturing, the largest contract chip manufacturer in the world, overnight reported revenue and profit higher than the Street was expecting, and forecast this quarter’s revenue higher as well, and the tone was relatively upbeat considering how much worry there has been about the chip industry this year.
During the company’s conference call with analysts, CEO CC Wei said chip companies continue to “adjust their inventory,” basically the same comment that memory-chip maker Micron Technology had made during its call last week. The sudden drop-off in demand for PCs and smartphones means that chip makers have to find a way to sell off inflated inventories of chips before they contract with TSM to make any more.
Even so, TSM’s revenue rose by thirty-six percent, year over year, to $20.2 billion, topping consensus for twenty billion, and in line with TSM’s own forecast.
The company’s forecast for the current quarter is a range of $19.9 billion to $20.7 billion, which is, again, ahead of the consensus for twenty billion dollars.
Numerous analysts tried to pin down Wei, and CFO Wendell Huang, about the timing of burning off that excess inventory. The duo stuck to a broad statement that things get better in the back half of next year. “It will take a few quarters through first-half 2023 to rebalance to a healthier level,” said Wei of the inventories.
The really astounding thing is that Taiwan Semi’s most cutting-edge chips, those measuring three billionths of a meter, or three nanometers, at their critical dimensions, are in greater demand than the company can currently supply. Revenue from three-nanometer, said Wei, will be higher than revenue from the prior cutting edge, five-nanometer, was when that generation was introduced in 2020. Those “three nano” chips, as they’re called, are being used for high-performance computing, probably things such as Nvidia’s latest and greatest chips, for AI and such, and for smartphones, probably Apple’s next thing.
Hence, inventory build-up is occurring at the same time as raging demand for the latest and greatest TSM can offer.
The healthiest part of the market for Taiwan Semi, said Wei, is chips for data centers — obviously, because of AI — and for automotive applications — obviously, more and more chips in cars. Wei told analysts he sees no slowing down at the moment, but also said he “wouldn’t rule out” a slowdown at some point. When he was challenged on that point, Wei clarified that he’s trying to be realistic given broad macroeconomic worries.
“The data center and automotive-related are still steady,” said Wei. “But now, the market becomes soft and we are taking a more conservative way in our planning for 2023,” he said. “And that's why we say that we don't rule out the possibility they might have some correction also, but, you know, we do not see it right now, to be frank with you.”
The one area of most tension at the moment is China, specifically new regulations imposed by the U.S. Department of Commerce on sales of chips to the country. Wei remarked that his “initial reading” after talking with TSM’s customers, is that the regulations imposed are “very high-end specification, which is primarily used for AI or supercomputing applications,” meaning that it only affects a small group of chips.
That is similar to what Nvidia’s CEO, Jensen Huang, had said when asked about the matter during a press conference I attended last month. The regulations are being characterized as “specific,” meaning, limited in their scope and effect. More a way to perhaps annoy Beijing, or keep it off balance, is the sense being conveyed.
Said Wei, “Therefore, our initial assessment is the impact to TSMC is limited and manageable.”
Taiwan Semi stock is up almost three percent this morning at $65.69. The shares are down twenty-five percent since being named in the inaugural TL20 in mid-July.
Welcome to downgrade season, look for the peak later this month
Oct 12, 2022
If you were watching stock ratings on Tuesday, you would have seen what seemed like a cataclysmic implosion of faith. Not only did the Nasdaq Composite have its fifth straight decline, closing down one percent, but there were a whopping eight downgrades of tech stocks, the greatest number in a while.
But, not actually that unusual, it turns out. This is about par for the start of earnings season. The chart shows you how things looked at this point in time last quarter, starting with the month of June. Oracle had been the first to report, on June 13th, the unofficial start of earnings season.
Daily downgrades of tech stocks by number of downgrades. Gaps indicate days that had no downgrades of tech stocks.
I’ve highlighted a few major milestones. In purple, the most recent bar on the chart, are the eight downgrades today, October 11th. In red, you see first the same date three months ago, July 11th. Lo and behold, there were also eight downgrades that day. Same part of the season, eleven days into the first month of the quarter, and the same number of downgrades.
The second red line, the highest line of all, was the third week of the month of July, July 22nd, when there were, in fact, twenty-two downgrades. I don’t know if there’s a numerological significance to twenty-two on twenty-two. What we do know is that the huge number was inflated by a whopping eleven downgrades of one stock, Snap, whose advertising business was in free-fall.
So, today’s downgrades are not unusual in their number. And, similar to the eight downgrades back on July 11th, the cuts today were distributed across different parts of tech. The eight names are: F5 Networks, Rackspace, Meta Platforms, Ciena, Zoom Video, Qorvo, Skyworks, SunPower.
I would note, too, that the average number of downgrades per day from June 1st through July 11th, 2.6, is precisely the same as what we’ve seen so far from September 2nd to today. So, there’s nothing unusual about the pace thus far.
Now, this is a completely unscientific exercise in fitting data to a curve, obviously, but if I were to infer anything from this small sample, it would be that the downgrades will peak in two weeks’ time, the week of October 24th, given that it corresponds to the position in the season of those twenty-two downgrades on July 22nd. The week of October 24th will be one of the heaviest weeks of earnings season, with reports from Alphabet, Amazon, Apple, ServiceNow, Meta, Twitter, Intel, and many more, so it makes sense you might see a giant cluster of ratings changes.
Of course, past performance is no guarantee of future returns, but at least it’s someplace to start.
Databricks CTO: Making our bet on the lake house
Oct 11, 2022
The ambition of Databricks, to make the data “lake house” ubiquitous, has the audacious goal of giving everyone in a company access to all the data, not just some of it. It could be the birthing of a new era for the database industry.
The vibrant innovation we’ve seen in cloud computing in the past decade resulted in many intriguing companies coming to the public markets, such as Snowflake and MongoDB — that is, before the IPO window slammed shut this year.
With the cancellation of the IPO market, future stars have been kept on the sidelines. One of those still-private entities is so far along, it’s surprising they didn’t come public in the cohort with Snowflake and others.
Databricks, a nine-year-old software maker based in San Francisco, is on course to make a billion dollars in annual revenue, growing at a rate of eighty percent. The company has received almost four billion dollars in venture money over eleven rounds, so it is not in need of any immediate financing.
The other day, I talked with Databricks’s co-founder and chief technologist, Matei Zaharia, on Zoom. You cannot buy shares in a private company, but its leaders can tell you a lot about the public markets that is worth paying attention to.
“Basically, we're trying to create a uniform sort of platform that you can use to manage all the data in an enterprise,” is Zaharia’s modest mission statement.
As Zaharia explains it, the database market has “two big platforms” for data. One is called a “data lake,” which is “a place you just put data in piles.” It could be spreadsheet data, it could be video files, it could be tons and tons of text documents. “That's actually where the majority of the bytes are” in the world’s aggregate data trove, says Zaharia. Usually, companies build data lakes by buying cheap storage at, say, Amazon AWS in which to dump their data.
“And then there's the data warehouse, which is where you curate the data into tables, and you add access controls, and you expose them to users,” says Zaharia. The data warehouse, a term that has been around for more than twenty years, is the manicured version of data, with everything neatly organized for the data scientist.
Snowflake is today’s paradigmatic example of the modern data warehouse. It uses cheap data lake storage from Amazon, yes, but then it groups data in structured ways that look familiar to an enterprise.
The “uniform platform” that Zaharia is building might be a kind of “third way,” if you will, a compromise. The amusing term for it is a “lake house,” meant to suggest the best of both data lake and warehouse.
“Our thesis on the market is that the best data warehouse is this lake house model where you can take all the raw data you have in Amazon S3 and you can turn that into very structured tables and get great performance out of it,” says Zaharia. (“S3” is the name for Amazon’s bulk data storage service.)
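To make the contrast concrete, here is a toy sketch in plain Python of what the lake house idea amounts to: raw, messy records sitting in a pile, normalized on the fly into a structured, queryable table, rather than being ingested into a separate warehouse first. Everything here, from the record formats to the field names, is invented for illustration; real lake houses do this with engines such as Spark and open table formats.

```python
import json

# A "data lake" is just files in piles: raw, heterogeneous records.
raw_lake = [
    '{"user": "ada", "amount": 30}',   # a JSON log line
    '{"user": "bob", "amount": 12}',
    "user=ada amount=5",               # the same facts in a messier format
]

def to_row(record):
    """Normalize one raw record into a structured (user, amount) row."""
    if record.startswith("{"):
        d = json.loads(record)
        return (d["user"], d["amount"])
    parts = dict(p.split("=") for p in record.split())
    return (parts["user"], int(parts["amount"]))

# The "lake house" layer turns the raw pile into a queryable table in place,
# rather than copying the data into a separate warehouse first.
table = [to_row(r) for r in raw_lake]

# Now a warehouse-style query runs over what is still raw lake data.
total_by_user = {}
for user, amount in table:
    total_by_user[user] = total_by_user.get(user, 0) + amount

print(total_by_user)  # {'ada': 35, 'bob': 12}
```

The point of the toy is the order of operations: the data stays raw where it landed, and structure is imposed at query time.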
The ambition is very broad access to information, much more than what employees at a company typically got.
“We're trying to make the lake good for both, and evolving it into something so that you can have everyone work with all the data” in a company. “That’s actually the model that a lot of the tech companies like Airbnb and Uber and so on use internally already — everyone who started in the modern world, and designed the data stack based on what's available” rather than working with the old stuff.
“We're hoping to bring that to traditional enterprise workloads and make it usable by everyone,” says Zaharia. “Our bet is on the lake house model.”
The world of data, then, is something of a pitched battle between a Snowflake approach, the data warehouse, and the Databricks approach, the lake house, though there’s considerable overlap at the end of the day.
The dueling approaches are a culmination of fifty years of database development that has seen two great revolutions.
A database, generally speaking, is an organized way to keep track of information, such as, for example, names and addresses of customers.
Up until the 1970s, IBM was a monopoly provider of databases, given that they provided the mainframe computers that ran the database, and the disk drives on which the data resided. The process of using a database was fraught because you had to know how the IBM mainframe was constructed to do the simplest sort of retrieval of customer data. You had to be a computer systems engineer.
An IBM scientist, Edgar “Ted” Codd, in the 1970s came up with a novel approach to databases. Codd figured everything would be better if the database presented a logical table of information, similar to what you see in your Excel spreadsheet, columns and rows, and hid the details of the computer. Give people access to data in a logical format, and it wouldn’t matter what was under the hood.
Thus was born the so-called relational model of databases, where the user manipulates relations between pieces of data. It was a profound insight, but, in true monopoly fashion, IBM dragged its feet realizing Codd’s vision. Instead, Larry Ellison ran with it and founded Oracle. Many other companies joined in the relational market, such as Microsoft, breaking IBM’s monopoly on database software.
Everything was dandy for a while, and database administrators got paid six figures to fine-tune the relational approach for big banks and oil companies and other enterprises. That is, until the Web came along, prompting the second big change to databases of the past fifty years.
Early in the Naughts, Google was struggling with a database problem of epic proportions: how to store and sort and sift data on all the billions of Web pages and their connections. Just as in the old IBM days, Google scientists were having to pay too much attention to the details of how data was stored across fleets of thousands of Google computers.
Google came up with some home-grown software that automatically divided up the Web indexing task across those fleets of computers while hiding the details from programmers. The result was a “parallelized” database that could easily sort and sift collections of data at unprecedented scale by treating many computers as one giant computer.
The Google approach was a hit, and numerous projects, both commercial and open source, built on its success. In 2009, Zaharia, then a graduate student at UC Berkeley, got together with some colleagues to build their own enhancement to Google’s approach, called Spark. Google’s system was very good, they said, but its Achilles’ heel was that it spent too much time looking for the data on all those disk drives inside all those thousands of computers.
Spark made one small but significant modification: it figured out how to keep more of the most-used data in fast DRAM memory, which can be read much faster than disk. Spark also altered the division of labor of Google’s proposal by assigning tasks to each computer depending on how close a computer was to the disk storing the relevant data.
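The two ideas, keeping hot data in fast memory and placing each task near its data, can be sketched in a few lines of toy Python. This is my own illustrative model, not Spark’s actual scheduler; the node and partition names are made up.

```python
# Toy model of two Spark-era ideas: keep hot data in DRAM, and run each
# task on the machine nearest the data (illustrative only; all names invented).
partition_location = {"part-0": "node-a", "part-1": "node-b", "part-2": "node-a"}
memory_cache = {}  # partitions already pulled into fast DRAM

def read_partition(part):
    """First read is 'slow' (disk); repeat reads hit the in-memory cache."""
    if part in memory_cache:
        return memory_cache[part], "memory"
    data = "bytes-of-" + part        # stand-in for an actual disk read
    memory_cache[part] = data        # cache it: the key to the speed-up
    return data, "disk"

def schedule(part):
    """Assign a task to whichever node already holds its partition."""
    return partition_location[part]

print(schedule("part-2"))           # node-a: the task follows the data
print(read_partition("part-0")[1])  # disk: first touch reads storage
print(read_partition("part-0")[1])  # memory: second touch is fast
```

A real cluster adds fault tolerance, eviction, and network costs on top of this, but the scheduling instinct is the same.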
The enhancements produced gigantic speed-ups, and Spark became a more-efficient way to run massive data operations on fleets of computers, perfect for the cloud era. “We made it easy for any data scientist or analyst to run things on these giant clusters of machines in a few seconds and do these interactive queries,” versus hours of waiting for results, explains Zaharia.
The deeper insight that Zaharia and colleagues had in building Spark was that if the database engine was efficient enough, then nothing specific about data formats needed to be imposed on the data itself, contrary to what Codd suggested. The problem of how to be astute about the location of data, they realized, was the essential problem of all data management. Solve it, and you could make any kind of database you wanted.
Zaharia and colleagues in 2013 gave Spark to the open-source world and went off to found Databricks to build a business on top of it. As Spark caught on in more and more places, the generality of its approach to handling data induced thousands of developers to write programs to run on top of Spark, to do things such as traditional database queries in the “SQL” language; massive scientific visualization projects; and machine learning forms of AI. Spark, and Databricks, became a very broad way to work with any kind of data for any kind of analysis.
In the modern database world, Snowflake’s data warehouse looks like the neatly ordered world of Codd, a manicured collection that is what enterprise IT expects. Databricks, on the other hand, building upon Spark, looks a bit wild and woolly, a pile of stuff in a data lake that you can turn into a warehouse if you so choose.
As Zaharia frames the competition, “The lake house, basically, gives you all the features you have in a traditional data warehouse but just on top of these massive volumes of data,” whereas, “Snowflake and traditional data warehouse is all based on the model that the first step is you've got to ingest data into it, and that converts it into an internal format, and then locks it up, and once it's in there, the only way you can access it is by writing a query to them [Snowflake].”
Being built directly on top of raw data means the lake house, says Zaharia, can plug in any kind of program, such as TensorFlow, the Google program for machine learning. Conversely, “If you store the data on Snowflake, TensorFlow can't access the data directly, it has to do a SQL query and wait for the engine to send all that data back, and you pay for the SQL engine to just convert the data to something TensorFlow can understand.”
By using the raw data in that way, the lake house will support more technologies, giving customers more choice. “If you're the chief data officer for some company and your job is to enable people to have the latest and greatest stuff on your data, you’ll probably choose the thing with the biggest ecosystem,” he says. “We have a very large ecosystem.”
The Databricks and Snowflake approaches have different economic aspects, too. Snowflake is prized by the Street because its customers sometimes binge on data, which leads to sudden upside for Snowflake’s quarterly revenue. Databricks, on the other hand, makes the sales pitch that the lake house way of doing things is more economical for customers.
“One of the things we do really well is we can run on large numbers of, kind-of, flaky cloud machines, at spot instance prices,” meaning, the cheapest offerings of compute capacity from Amazon AWS. A quick and dirty approach saves money, in other words.
Because Databricks is based on open-source software, the question inevitably arises as to whether it is an open-source company, the way Red Hat was.
Zaharia’s astute observation is that most everyone in the world these days makes money in one way or another off of open source.
“If you think about it, Amazon Web Services is by far the most successful open-source business in the world,” says Zaharia. All of EC2, Amazon’s main cloud computing product, he notes, “everything on there runs Linux, so they monetize Linux, there’s tens of billions of dollars of revenue from Linux.”
Databricks, he says, is not exclusively an open-source company. Rather, “we're designing a business that is complementary to all the great stuff happening in open source.” Open-source software is an historic movement. “We'd rather be on the right side of it,” says Zaharia.
There are other vendors of Spark as a service, including Amazon’s version of it. Zaharia’s belief is that his company is the gold standard.
“There's always going to be competition for things like this, but I think we definitely have the best place to run it,” he says. “For enterprises that just want to get things done, it makes a lot of sense,” he says, to go with an experienced service provider.
Databricks is not limited to Spark, which is one of the surprising things I learned talking to Zaharia. The deep insight buried in the genesis of Spark, remember, was that knowing where to place computing tasks in relation to disk and network is a general principle of using data.
“We don't have to translate everything to Spark,” he says. “A lot of what we do is simply the launching, managing of workloads, installing all the software on them, checking if they work well — that’s useful for anything you have to run, not just Spark.”
Databricks is increasingly presenting itself as a general system for distributed computing. Distributed computing has always been challenging to engineer. Even with Spark freely available as open source, there is complexity that companies would rather pay Databricks to solve.
“A lot of the things you offer as a service, even if I gave you the code for them, an enterprise wouldn't be able to run it and get the same level of reliability and usability as getting it as a service from us,” says Zaharia.
Part of the Databricks business model has been to foster numerous innovations that partake of the same distributed approach as Spark but for more specific tasks. For example, MLFlow, developed at Databricks and made open source, is the most widely used program for what’s known as “MLOps,” industry jargon for running machine learning forms of AI inside a business.
There are large trends in data that are not limited to Spark and distributed computing, such as Kafka, the open-source software that lets companies monitor “streaming” signals from their applications in real time, millisecond by millisecond.
“Streaming is one of the fastest growing areas we see in terms of computation on our platform,” says Zaharia, “and it's already a significant fraction of what the platform does.” In addition to plugging into Kafka, Databricks supports the commercial implementations, such as Confluent, and Amazon’s Kinesis product. Databricks is “doubling down on streaming” with a lot of recent hires, he notes.
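The basic shape of such a streaming computation, grouping millisecond-stamped events into time windows and aggregating within each, can be sketched in plain Python. This is a toy stand-in for what Kafka plus a streaming engine does at scale; the event names and timestamps are invented.

```python
from collections import Counter

# Toy stand-in for a Kafka-style event stream: each event carries a
# millisecond timestamp and a payload (all values invented for illustration).
events = [
    (1000, "login"), (1005, "move"), (1010, "move"),
    (2001, "move"), (2002, "abuse"), (2007, "move"),
]

def windowed_counts(stream, window_ms=1000):
    """Group events into fixed time windows and count event types per
    window, the basic shape of a streaming aggregation."""
    windows = {}
    for ts, kind in stream:
        bucket = ts // window_ms
        windows.setdefault(bucket, Counter())[kind] += 1
    return windows

w = windowed_counts(events)
print(w[2]["abuse"])  # 1 — an 'abuse' event flagged in the second window
```

The difference in a real streaming system is that the windows close and emit results continuously, as events arrive, rather than after the fact.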
One example of streaming involves an online video game developer that had a fraud detection program. With Databricks’s help, the customer modified the fraud program to instead monitor real-time game play to detect abuse, cases where gamers were violating terms of play. Those are the kinds of modern millisecond apps that intrigue Zaharia.
“The ones I'm most excited about are the ones that actually affect the product,” he says. “They're not just analytics, they actually affect what the person will see next, or they ban the person who’s doing something bad or whatever.”
Somewhere on the horizon is the prospect that Databricks could be a very, very broad platform to run enterprise computing across many public cloud computing services. This has been referred to in the trade press as “multi-cloud,” although I prefer my own term, “trans-cloud.” It incorporates the idea of having computing tasks that span a collection of computing resources.
“We believe that people will want to be able to move their workloads across platforms,” says Zaharia. “Our approach is, there are open APIs for your workload,” meaning, the programming interface, such as Spark, “and then there are things we can deliver as a service like reliability or performance or security features.”
My takeaway from all of this, as someone who follows public markets, is twofold. First, there is plenty of room for competition with Snowflake, which you wouldn’t think is the case from how richly valued Snowflake stock is, at twenty times next year’s expected revenue.
Second, I think there is a whole generation of infrastructure that is going to be built on top of public cloud computing that is just coming into view. People talk about the “move to the cloud” for enterprises as the final act of a very long play. It may be just the beginning, however.
Perhaps this is the third big change in the history of databases, a kind of primordial soup of information in the data lake out of which something structured arises that spans computing infrastructure.
If and when Databricks comes public, they would not be the only public promoters of the lake house, but I could imagine they would be a “pure play” on that new class of infrastructure.
When I was preparing to interview Zaharia, the main thing running through my mind was why Databricks hadn’t gone public already.
Zaharia’s reply, sensibly enough, is that his company’s market is still so much in flux, it’s not a bad thing to be private for the moment.
“We’re in a market that's fast-growing and also fast-changing,” Zaharia tells me. “We've still been innovating and creating new things, sometimes things that compete with the old way of doing stuff,” he observes. “That can be scary for public investors” who don’t like change, he says.
For example, Databricks recently introduced something called “Delta Live Tables,” which seeks to aggressively optimize customers’ jobs to reduce their costs.
“A public investor may look at that, and say, is this going to reduce your usage a lot?” The last thing investors want to hear is that your software is being used less. “But, as a private company, it's easier to say, yeah, this is the right thing, long term, for the business.”
He doesn’t lament missing the IPO window, he tells me.
“We’re in a great spot,” says Zaharia. “Our customers are growing quickly, year over year, the market is clearly growing.” During the slowdown that has hit some software firms in the most recent quarter, he observes, “it didn't really slow us down much, and in some cases, it actually is speeding up, the customer moving to a more modern architecture or consolidating more systems onto the lake house.
“So, I don't think we missed out on that much.”
Yet another shoe! AMD’s revenue warning is latest bit of chip exorcism
Oct 06, 2022
Surely the semiconductor market must be getting to its “trough” level, from which things pick up, but there are many, many shoes that have to drop before that happens.
The previous warning had been last week’s presentation by memory-chip maker Micron Technology about what it called “an unprecedented confluence of events” that “has affected overall demand” for chips, especially in smartphones and personal computers.
That dour outlook followed Nvidia’s warning in early August that the video game portion of its businesses sustained a much larger-than-expected drop in demand in the July quarter.
And the latest bit of bad news comes to us this evening from Nvidia competitor Advanced Micro Devices, which announced after market close that its revenue for the third quarter that ended last month will be more than a billion dollars below what the company had forecasted back on August 2nd, a total of $5.6 billion versus the original forecast of $6.7 billion, give or take $200 million. Remember, that forecast was already disappointing at the time it was offered.
AMD shares declined about four percent in late trading.
The entire shortfall, said AMD, was in its group selling into PCs. “The PC market weakened significantly in the quarter,” according to AMD’s CEO, Lisa Su, adding, “While our product portfolio remains very strong, macroeconomic conditions drove lower than expected PC demand and a significant inventory correction across the PC supply chain.”
In fact, the drop in PC chips was pretty huge, a decline of forty percent, year over year.
The other parts of the business, chips for game consoles, for the data center, and for embedded computing, appear to be doing very well. Sales in the data center, for example, were up forty-five percent from the year-earlier period, and eight percent from the prior quarter. And while Nvidia had a rough time with its gaming products in its report, AMD saw sales of gaming chips rise fourteen percent.
The company’s not yet forecasting anything for the December quarter.
The story of a very weak PC market has been going on all year. And as I noted in July, the impact had already taken center stage as one of the main themes of earnings for the entire technology industry.
So, the PC falling apart is old news, and if you wanted to be an optimist, you could say that Micron’s warning brings everyone closer to the eventual bottom in the chip market, perhaps in the second quarter of 2023, as I argued earlier this week.
But every day of concern about potential recession makes the latest dire warning about PCs have fresh relevance.
AMD plans to give analysts more details during its formal conference call on November 1st.
With the after-hours decline, AMD stock is down fifty-five percent this year, and down nineteen percent since I picked it for inclusion in the TL20 group of stocks to consider.
What’s the next shoe to drop? Well, Intel will report results Thursday, October 27th, after market close, and we will see if things have gotten better or worse since the company delivered terrible results and awful outlook in late July.
Informatica CEO: one metadata to rule them all
Oct 06, 2022
There has been something of a renaissance in the past decade in the sleepy land of managing vast piles of information, the province of databases and related technologies. The renaissance started because of the rise of cloud computing roughly sixteen years ago, but it has taken on new and interesting forms as cloud computing has spread.
At the heart of those developments are prominent young, public firms including Snowflake and Confluent, as well as important young private companies such as Databricks and StarTree.
And then there are established companies that are transforming themselves in this riotous new age of data.
Amit Walia, the chief executive of software maker Informatica, was swinging through New York two weeks ago for a series of meetings, and was kind enough to invite me for a chat.
We had already talked in August about the company’s quarterly results, and how the company has changed since it came public a year ago after being taken private in 2015 by private equity.
This time around, I was keen to know just where Informatica fits in a constellation of rapidly expanding cloud software companies such as Confluent.
Within that renaissance of data, one of the most significant questions about how things take shape going forward is the question of what to do with what’s called the “metadata” of the cloud.
If the database is the information proper, the names of customers, the records or their product orders, and related information such as the addresses of suppliers, then metadata is like the card catalog in the library, for anyone old enough to remember those, an index that lists attributes about the data, such as where data physically resides, in what server, according to what category, etc.
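A toy version of that card catalog makes the idea concrete: the catalog holds only attributes about each data set — where it lives, what kind of thing it is, who owns it — and can answer questions without ever touching the data itself. The field names below are my own invention, not Informatica’s schema.

```python
# Toy "card catalog": metadata describes where data lives and what it is,
# without holding the data itself (field names invented for illustration).
catalog = [
    {"name": "orders",      "store": "aws_s3",    "kind": "table",  "owner": "sales"},
    {"name": "clickstream", "store": "kafka",     "kind": "stream", "owner": "web"},
    {"name": "customers",   "store": "oracle_db", "kind": "table",  "owner": "sales"},
]

def find(entries, **attrs):
    """Answer 'where does X live?' questions from metadata alone."""
    return [e["name"] for e in entries
            if all(e.get(k) == v for k, v in attrs.items())]

print(find(catalog, owner="sales"))  # ['orders', 'customers']
print(find(catalog, kind="stream"))  # ['clickstream']
```

The queries never read a byte of the underlying data, which is why a metadata catalog can span hundreds of repositories that will never be consolidated into one.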
All of the relevant companies such as Databricks want to have a hand in the metadata. The reason is that metadata is powerful in an era when data is increasingly stored in many, many places in cloud computing facilities. Metadata becomes a kind of essential catalog to tame the complexity.
For years, Informatica was a toolkit to help IT to clean up and prepare and migrate collections of data between different databases. Under Walia’s direction, the company has expanded from tools in the enterprise to an expanding suite of code that runs as a cloud service.
How does Informatica place itself within the efforts of all these companies eyeing metadata? I asked Walia when we met at Informatica’s satellite office in midtown Manhattan.
“It’s an apples and oranges conversation,” was Walia’s reply. “There’s metadata everywhere, data is fundamentally fragmented,” he says. There is something of a distinction, he implies, between metadata in general, and the metadata, the one true source.
“Our view is, How do I get a single view of the entire data by not putting it in one place, but by seeing the metadata?”
“Every organization has not only the old databases,” explains Walia. “They have [Amazon AWS] databases, [Microsoft] Azure databases, Oracle databases, Snowflake, Databricks, and none will ever have more than three or four percent of a company’s entire data.”
Walia’s pitch to his customers today is that his metadata software is like Google is for the Web, the index into all the disparate stuff that is being massively accumulated as “big data” grows and grows.
“There will be hundreds and hundreds and hundreds of repositories and sources of data, you will never have one place where all the data sits,” says Walia. “We want to be the system of record for metadata, bringing together all the metadata — that’s our vision for the next five to ten years.”
To that end, the Informatica software handles a heavy load. The company’s artificial intelligence capability, called CLAIRE, processes eleven petabytes of metadata every day, he says. (A petabyte is a thousand trillion bytes.)
Besides the analogy to Google, Walia uses the analogy to Switzerland in the complex world of the Web. Underlying the ambition of companies such as Confluent and Snowflake is the renaissance of software I mentioned, a fascinating soup of mostly open-source programs with playful names such as Hadoop and Spark and Kafka and Pinot.
With all these data offerings, Walia’s view is that the advantage for Informatica is precisely the difficulty of constructing a single “view” across whichever of the cloud programs his customers choose to use.
“I’m not an open source company, but I support every open-source technology,” says Walia. That includes the widely popular program Spark, begun by founders of Databricks, but also all the derivatives of it. “I support open-source Spark, I support Databricks Spark, I support Azure Spark, I support AWS Spark,” and on and on.
“We do deep integration with Snowflake” via partnership, he adds.
“My job is to support all of them, the customer can pick and choose what they want.”
In fact, in the universe of metadata, Walia claims for Informatica a certain influential primacy over Databricks and others. After all, Informatica, now twenty-nine years old, had an extensive business as a public company years before these other companies were founded and years before Spark and Kafka and similar technologies emerged.
“We have been part and parcel of Spark,” says Walia. “Spark wouldn’t exist without Informatica,” he adds. “If we had not supported the development of Spark, it would have gone nowhere.”
When he talks about the competition, Walia draws a line between the open-source software he supports, such as Kafka and Spark, and the young companies profiting from them.
“Last I checked, there has not been a successful open-source company since Red Hat,” he declares, referring to the division of IBM that sells services for the open-source Linux operating system.
I’m surprised by that assertion, and I point out that both Databricks and Confluent are nominally open-source software companies, and promising ones.
No, he counters, “Databricks and Confluent started as open-source resellers of Spark and Kafka” but no longer are open. Confluent, he says, is “completely proprietary, you can’t go back to open-source Kafka” once you’ve bought into Confluent’s service — a claim that I have yet to fact-check with Confluent, I should note.
From Walia’s point of view, as a catalog or index, his company surveys a vast, shifting landscape of different cloud software with a gaggle of smaller companies, some of whom compete with Informatica, some of whom, like Snowflake, are partners, mostly, for the moment.
Perhaps more meaningful than being a Google or Switzerland by analogy, the premise of Informatica is that by representing all the metadata of a company’s scattered databases, the company can help a customer gather what they need to do actual work with those far-flung resources.
“We are focused on solving mission-critical problems with mission-critical workloads,” he says, adding, with a bit of cheek, “we have never been focused on how crypto [currency] can change the world.”
As an example of mission critical, consider an inventory problem for a retailer, where product SKUs are in one of many databases or Spark collections. “I’m a Kroger, I want to figure out from all of the interfaces I have, point of sale, inventory data, what is the right amount of shelf that I need and how can I re-stock it from my warehouse,” Walia explains. “Because if I don’t stock enough, I lose sales, if I stock too much, I may have waste, and that’s cost for me.
“We are solving those mission-critical problems.”
Walia, who was the head of product development when Informatica was taken private, is inclined at times to dip into technical details more so than some other CEOs. In serving the most important applications, he tells me, Informatica’s software finds ways, automatically, to “push down,” an expression I had not heard before. It means to decide how best to tweak the code of a database request so it will run as efficiently as possible depending on which of those databases it is tapping, which can boost performance.
“What used to take Kroger a month to aggregate data, bring it all together, normalize it,” the various tasks of data cleanup, “we were able to do that in less than five days,” he boasts.
A job at a bank that takes an hour to run data, “we can cut it down to seconds.”
What happens when all his customers have moved everything to cloud computing? I ask Walia as my exit question. “That’s going to be at least another five to six years,” he says, “it’s not for the faint of heart.”
When it does, eventually, happen, “I think there’ll be a whole new generation of innovation that’ll happen after that,” he says, cryptically.
The other thing that will happen is automation, says Walia, via artificial intelligence. The scale of things is becoming enormous. That eleven petabytes of metadata his company’s software manages daily “is doubling and doubling,” he points out.
To automate things will require AI. “I mean, operational AI, not monkey-around AI,” he adds. Not programs to make statistical predictions, in other words, but, “a view of the metadata that tells IT, Hey, you’re running these ten workloads and three of them only run ten percent of the time, you could actually save a lot of cost if you make these changes.”
“The scale and the scope of the work will increase too much,” he says, “so, both intelligence and automation will be needed, and cloud will be the place where AI will play a big role.” And, by implication, Informatica will play a big role, too.
Informatica stock is down forty-one percent this year, and eight percent since the earnings report, at a recent $21.64.
Nutanix CEO: Cloud supply and demand may be the key
Oct 04, 2022
It has been a confusing 2022 for investors in Nutanix, the company billing itself as the Airbnb of cloud computing. Happily, it seems the worst may now be behind the company.
The company’s drama started in March, when its sales forecast for the year ended in July came in just a tad above what the Street had been expecting, a so-so outlook. There was worse to come, as the company in May cut that outlook and had its first big miss relative to expectations in three years.
Up and down, what is going on here? Nutanix sells software that runs on top of Dell, HP, Lenovo and other companies’ server hardware, as a “virtualization” layer that makes those machines more efficient to use. Dell and the others continue to contend with the supply chain mess, where they can’t necessarily get enough chips to complete and ship their machines. When Dell can’t ship servers, Nutanix can’t sell software.
Nutanix is a step removed from what Dell and others are dealing with, like someone hearing about a fire down the block, with no certainty, and no control.
“March was fine, we didn’t see any supply chain issues, and then at the end of Q3, people were telling me they couldn’t get servers,” says CEO Rajiv Ramaswami in a meeting we had via Zoom a week ago. “It was a bit of a surprise for us, we had to take down our outlook; we assumed it would get worse,” meaning, the supply chain situation, he explains. “And it did get worse, but not as bad as we anticipated,” hence, the positive forecast for this year.
Welcome to global business in 2022, where it’s just really hard to know what’s going on. The upshot is, as of right now, “we’ve seen it’s pretty stable, it’s not getting worse, but not better either,” says Ramaswami of the supply chain situation. Based on what Dell and others tell him, he is cautiously optimistic that things will “get better early next year” with the supply chain.
And so, in a world of uncertainty, what does a software maker do to stay on an even keel?
One important dimension to this fiscal year, recession or not, is a certain stability in the business model for Nutanix, represented by a giant pile of what are known as “renewals.”
The Street is simply dazzled by renewals. Nutanix shifted its business three years ago from selling traditional term license contracts to selling software as a subscription. The immediate effect was to put pressure on revenue growth, because subscriptions are usually smaller commitments by customers.
But the upside to subscriptions, over time, is that customers who bought the subscription are forced to renew periodically to keep using Nutanix’s software, kind of the way you have to keep paying Adobe to keep using Photoshop.
The reason the Street loves renewals is that Ramaswami and his newly appointed CFO, Rukmini Sivaraman, have told the Street that renewals are highly reliable. Nutanix doesn’t give out the actual number of renewals on a regular basis, nor their dollar amount. Instead, what it says is that the company’s “gross retention rate,” a measure of how many existing customers stick with the software, continues to be ninety percent or better.
In fact, Ramaswami has told the Street that the majority of revenue growth in this new, upbeat forecast for this year is going to come from renewals, without quantifying that. During a Goldman Sachs conference last month, he told Goldman analyst James Fish that the first subscription customers, from three years ago, hadn’t yet renewed. Subscription terms are averaging a little over three years. “So, that volume is sort of ahead of us, if you will, in 2023 and beyond,” said Ramaswami, meaning, a huge wave of renewals.
“We have a base of renewals, and the base of renewals is largely independent of the macro,” Ramaswami tells me during our Zoom meeting, “in the sense that our stuff is being used for mission-critical applications that are running, so customers will likely just renew — that business is pretty solid.”
Not only solid, but extra-profitable. Renewals sales are almost like free money in that they don’t require as much sales and marketing effort to generate. The cost of goods sold can be as much as eighty percent less for those deals, a huge boost to margins.
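A toy calculation makes the margin point concrete. The revenue and cost figures below are hypothetical, chosen for illustration only, not Nutanix disclosures; the point is simply that if the cost of sale on a renewal is eighty percent less than on a new deal, the margin on that revenue jumps dramatically.

```python
# Hypothetical illustration of the renewal-margin point; the figures
# below are made up, not Nutanix's actual numbers.
revenue = 100.0
cost_new_deal = 40.0                 # assumed cost of sale on a brand-new subscription
cost_renewal = cost_new_deal * 0.2   # "as much as eighty percent less" for a renewal

margin_new = (revenue - cost_new_deal) / revenue      # 0.60, i.e. 60%
margin_renewal = (revenue - cost_renewal) / revenue   # 0.92, i.e. 92%
print(margin_new, margin_renewal)
```

Whatever the real cost structure, the direction is the same: renewal revenue arrives with far less cost attached, which is why it falls almost straight to the margin line.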
That’s important because Nutanix achieved positive free cash flow this past year, the first time it has done so since it switched to the subscription business model. “This is a milestone for us,” says Ramaswami. His intention is to keep things that way. “We are very focused on profitable growth,” he says.
With renewals in the bag, so to speak, all the uncertainty in the outlook for this year is the new stuff, selling new subscriptions. “We haven’t seen a slowdown in demand yet,” says Ramaswami, emphasizing the tentative quality. He is mindful that signing new business can become an issue depending on recession.
“The new business is new demand, which is where I think you are going to see macro uncertainty and slowdown being reflected,” he says. “That’s where we’ve assumed modest growth.”
It’s never a great thing to wait and see what the economy will do to your business, even if you’ve got a cushion of renewals. And so, Ramaswami is banking on the mission-critical aspect of what he sells to keep new customers coming in the door.
If there is an economic argument for Nutanix in a recession, or even a modest downturn, it is that the supply and demand of the public cloud needs a middleman, a broker, that notion of Nutanix as a kind of Airbnb of cloud.
Nutanix’s software started out as a way to make companies’ private data centers more like the public cloud. But being the Airbnb of cloud means increasingly selling software as a service that will run in public cloud facilities such as Amazon’s AWS and Microsoft’s Azure.
Consider the laws of supply in public cloud computing versus private data centers. Cloud computing operators such as Amazon keep building, and need to monetize that growing capacity, while enterprises in tougher times will try to hold off on their capital expense by not building more data centers.
“Their business model,” says Ramaswami of AWS but also managed service providers such as Equinix, “is that they spend a lot of capital ahead of time, they have this stuff sitting there, and then they bring in tenants, they bring in people to use it.” Amazon and other providers are “very motivated,” he adds. “If they see their capacity filling up, they will go do more and expand more.”
Whereas “in enterprise, they don’t do that — they will spend the capital when they need it, not spend it way ahead of time” — especially in a recession.
For Amazon and Equinix and others, “they are motivated to partner with us because this is a way for them to get more applications into their data centers,” says Ramaswami.
Nutanix, in other words, becomes a broker of demand for the hungry cloud operators.
For the enterprise customer, the consumer, it’s all about smoothing their path to using public cloud as an economic solution to avoid heavy expenses.
The Nutanix software, by abstracting away the details of different cloud environments, makes it easier to move a given computing task from the corporate data center to the cloud, or from one public cloud service to another.
“There are three use cases that people have used us for in production,” says Ramaswami, “and in each of these cases there is an economic value proposition.”
One use is disaster recovery, where a company places backup copies of data in the public cloud, and only turns on computing if disaster strikes. “So, you’re paying for compute only if you need it — this is actually quite economically viable.”
Second is companies that want to expand their operations without building new data centers. A U.K. customer was running the Nutanix software in England in their own data center and wanted to start operations in Asia.
“Typically, they would have gone out there, leased capacity for a data center, bought hardware, and it would have taken six months.” Instead, the company subscribed to the Nutanix software as an app on Amazon’s AWS service. The customer was up and running in a month.
Companies such as Lands’ End have done the same thing during seasons of heavy consumer demand, using the public cloud temporarily to expand their operations.
The third economic reason is just the plain old move to the cloud of everything, which is always going on, and which brings some economic benefits such as reduced capital expense for a company by renting computing.
Of course, it is by no means guaranteed that Nutanix will get the most business by being the broker to the cloud. Lots of other companies want to broker between supply and demand, including some of Nutanix’s partners, the hardware vendors such as Dell. The mission is to keep ahead of Dell and the rest, to be the top broker, or one of them, anyway, to not only sell easy renewals but to land new customers, and new partners.
Recently, one of Nutanix’s longtime competitors, VMware, said it would accept a buyout offer by chip and software conglomerate Broadcom. Ramaswami expects that to be a potential edge for Nutanix because mergers create uncertainties for customers.
As Ramaswami puts it, reflecting on the VMware deal, “That’s somewhat to our favor, because I’ve seen there’s a lot more interest by customers how to manage their risk and uncertainty when something like this happens.”
Shares of Nutanix are down thirty-three percent this year, at a recent $21.33, and roughly flat since the earnings report on September 1st.
In Barron’s Advisor: Chips are not as bad as you think
Oct 04, 2022
Monday was an interesting day for chip stocks: they were some of the best performers, with shares of Nvidia, Intel, Advanced Micro Devices, and Applied Materials all rising four to seven percent.
That kind of bump is in defiance of constant worries about the chip market, including really bad headlines, such as Micron Technology’s forecast last week that missed by a mile.
But, the gains Monday are not so surprising, as I argue in my latest missive for Barron’s Advisor, published today. (Subscription required to read Barron’s Advisor articles.)
What has weighed on chip stocks most of this year is the cycle, the expectation that two healthy years will be followed by at least one, maybe two lousy years. By way of background, the chart below shows ups and downs in revenue going back all the way to 1976, courtesy of the industry consortium Semiconductor Industry Association.
The chart shows the peaks and valleys of the industry’s revenue annually, approaching what is expected to be fourteen percent revenue growth this year, a total of just under $620 billion, after a very healthy twenty-six percent growth last year. That may be followed by four percent growth next year, according to a forecast by the SIA’s collaborator, the World Semiconductor Trade Statistics organization.
Global annual semiconductor revenue from 1976 through 2023 in billions of dollars. Figures in green are forecasts for the remainder of 2022 and for 2023. Source: World Semiconductor Trade Statistics, Semiconductor Industry Association.
Now, everyone knows the numbers have to come down, because the WSTS is always late to call the downturn. However, the case I made in Barron’s is that assuming a downturn, things will be better than feared.
The main reason that will be the case is that the worst declines of past down cycles were an effect of over-supply. What is happening this time around is not a surge in supply but a decline in smartphone and PC sales, complicated by the ongoing supply chain issues.
A demand-driven decline like we’ve been seeing is not as serious as a supply-driven decline, primarily because it is easy for demand to return once an appetite for devices returns, whereas it’s really hard to get rid of a giant pile of chips that should never have been built to begin with. Demand is easier to fix than supply.
Demand will return, the product categories of PC and smartphone are not going away. We’ve seen this situation play out before. During the Great Recession, chip sales declined in 2009, but not as badly as the WSTS expected. Meantime, the stocks soared that year.
I surmise something similar is happening now as indicated in Monday’s rise. As the news gets worse, estimates get cut, and that brings chip investors closer to the bottom of things, and, consequently, closer to the turnaround.
When to focus on the numbers and when not to
Sep 30, 2022
Probably as an occupational hazard of being a reporter, I tend to approach stock investing with two minds, one that I would call qualitative, and the other quantitative. It’s a strategy that has evolved over years for me and it’s different from how some other stock pickers approach things.
On the qualitative side, I tend to approach things first based on the story of what a company is working on. That’s a narrative, and it’s qualitative, not quantitative. What is this company’s mission? What, if anything, is there of substance in the company’s technology? How astute is management? How does the company plan to “crush the competition”?
I tend to fall back on the wise words of Jim Barksdale, formerly head of Netscape and before that, president of FedEx. Barksdale, asked about the key to success in business, said it was all about “finding a parade and getting in front of it.” I tend to think that’s pretty true. The best companies such as Amazon and Apple found ways to insert themselves into markets and extract value by seeing what was coming into formation.
On a deeper level, it goes back to my overarching theory of technology. Technology is a loom, on which a great, never-ending tapestry is being woven. It is larger than any one company, and it runs the entire history of humanity. Companies are stronger to the extent they contribute to the threads of technology, and thereby put themselves in alignment with something very profound that endures.
Because of that focus on the story of technology and companies, I tend to evaluate companies less according to numbers such as this or that income statement line, or The Metrics such as “retention rate” and the like. Don’t get me wrong, those numbers are important, but to me, they tend to be trailing indicators. Long after a company has chosen a wrong path, and run away from the parade, if you will, weak numbers are the symptom.
When Meg Whitman took over Hewlett Packard in September of 2011, the company’s R&D spending was catastrophically low as a percentage of revenue, at 2.6%. But that was after years of wandering in the wilderness as one group of executives after another failed to really have a sensible mission for the company — what I called “more than a decade of multibillion-dollar blunders,” when I wrote about Whitman’s challenge for Barron’s.
It was, in fact, the numbers-obsessed who, cheering on Whitman’s predecessor, the late Mark Hurd, in his campaign of slashing costs to the bone, led the company to such a low level of investment.
By the time Whitman came in, looking at the numbers told you what you might have already surmised from the narrative had you been paying attention years or decades prior. Bill and Dave’s company had long since ceased to be, and the numbers were just a skeleton crew.
On the flip side, with very young companies, the numbers are often a way to over-think a company’s potential. A young company that has built a business of half a billion dollars in annual revenue really hasn’t accomplished that much. The world is filled with companies of half a billion in revenue. Which is not to say they won’t be future stars. But those companies are not allowed, in public markets, to grow into their potential in a sensible fashion. Once public investors get ahold of such companies, every quarter is a quarter of obsessing over the financials to try and prove the company will definitely be a future titan.
Investors with spreadsheets, in a sense, are like overly competitive parents at a New York prep school who are trying to make sure their child, at age three, is “gifted,” and therefore destined for Harvard. There may, indeed, be gifts, but you’ll crush them by constantly imposing standardized tests.
So, I don’t really use the numbers to tell me about a company, I use my qualitative sense of the mission. That approach has the advantage that there tend to be fewer people on the Street who can follow a narrative than those who can generate a spreadsheet.
I have another part of my brain, formed from covering stock trading, that does rely on numbers. I use numbers to tell me not about companies but about investors.
Markets might be rational, I don’t know, but my sense is that numbers can often tell us how irrational people seem to be acting. For example, a year ago at this time, it was possible for me to write about how stocks that were darlings of the market such as Snowflake were trading at absurd multiples of future revenue, meaning, potential revenue that was years away from being a reality. Seems hard to believe now, as Snowflake today is no longer absurd, merely very expensive.
The point is, valuations in tech-land had gotten to an insane level where people were inventing justifications to buy things. The numbers were detached from a logical analysis of profits and were about contriving excuses, similar to how medieval astronomers kept adding complications to make sure their models showed the earth at the center of the universe.
In that case, the numbers were telling me that investors were in a rather crazy place.
In July, as I was putting together the TL20, the numbers were saying something different. Basically, all the whimsical justifications had gone out the window and people were dumping things regardless of fundamental financial potential. Yes, there was suddenly inflation, and rate hikes. But what was mostly going on was the end of a regime of justifying anything and everything by whatever means. All the prep school parents had decided their darlings were idiots because they failed a test.
And what was left were companies with still tremendous capabilities and tremendous markets, which were suddenly, in many cases, cheaper than they had been five years ago, even though they probably still had years of good growth ahead of them.
My bicameral approach, then, is that I tend to rely on the story about companies and technologies, and Technology writ large, to think about companies’ potential, and I rely on the numbers to tell me just how crazy investors might or might not be at any given time.
Does that work? We’ll find out with the TL20. But years of recommending stocks at Barron’s and SmartMoney, and writing about tech at many places including Bloomberg, generally showed me that a company that gets in front of a parade can be successful, and that buying when prices are low, and selling when high, can be a generally good approach to stocks.
During tonight’s conference call with analysts, following a terrible forecast, Micron’s CEO, Sanjay Mehrotra, emphasized the positives for Micron, such as a rock-solid balance sheet, but he, and CFO Mark Murphy, also used the word “unprecedented” a total of seven times to describe the challenge facing the chip industry.
Here’s the short story: the sudden drop-off in demand for PCs and smartphones, two markets making up just under half of Micron’s business for DRAM and NAND memory chips, has created a kind of V-shaped collapse in total chip demand. Customers of Micron are having to use up inflated inventories of chips before they buy any more. Demand should return in the middle of next year, marking a kind of bottom for the chip industry.
However, it’s all a bit clouded by the fact that the global economic situation is volatile, as both executives noted.
“An unprecedented confluence of events has affected overall demand,” said Mehrotra, “including COVID-related lockdowns in China, the Ukraine war, the inflationary environment impacting consumer spending, and the macroeconomic environment influencing customers' buying favor in multiple segments.”
On the positive side, the company just concluded a year with record revenue; its balance sheet is stronger than ever; it is turning out more sophisticated kinds of chips than ever, speaking to its increasing technical acumen; and over half of Micron’s business is now in areas other than PCs and smartphones, namely, data center memory chips and automotive and computer networking and industrial applications.
“This portfolio transformation will increase our exposure to the most attractive and stable profit pools in the industry,” said Mehrotra of the business shift.
From an investor standpoint, the immediate bummer is that this quarter, Micron will have negative free cash flow of one and a half billion dollars, which means no buybacks, at a time when the stock is down forty-six percent and plenty of investors would love it if the company could step in to support the shares.
Street response this evening focuses on grappling with the gloomy outlook.
From Robert Maire of Semiconductor Advisors: “It looks like we will be having a good old fashioned down cycle.”
This is going to spread to other chips, writes Maire. “The simple reality is that if manufacturers are buying less memory, they are buying less of other semiconductor types, it’s just that simple.”
I would just add that one has to keep in mind the incredible demand surge that brought us to this point.
It’s easy to forget that some of what is happening is a big cooldown from an extraordinary two years of binge-building that consumed a lot of memory chips, and all kinds of chips.
For example, in smartphones, although demand is on the wane now, as Mehrotra said, “despite the weakness in end unit sales, we achieved two consecutive years of record mobile revenue in fiscal 2021 and 2022.” Nothing lasts forever, things have to change at some point.
And while Micron is cutting its capital spending, it is also important to put that in context. Mehrotra said the company still intends to spend forty billion dollars in the U.S. through the end of the decade to build manufacturing facilities, contingent on getting support from the CHIPS Act. In other words, there is still a lot of money going into building chip factories and buying equipment.
Funny enough, despite the gloom, Micron shares turned around in late trading, rising fractionally to $50.07. As I wrote on August 9th, numbers have been coming down for Micron for months now. Thursday evening’s gloomy news may finally be enough to push estimates so low for Micron that the stock can begin to appreciate again.
Earlier:
In the eye of the storm, selling a commodity product into a crumbling chip market, Micron Technology this evening reported fiscal fourth-quarter revenue that was lower than its already disappointing forecast.
Revenue of $6.64 billion was below the range of $6.8 billion to $7.6 billion the company had offered back in June, which had been over twenty percent below consensus at the time.
The final number is below the much-reduced consensus for $6.7 billion.
All of this is in keeping with a warning that Micron offered on August 9th, saying that supply of chips was piling up as Micron’s customers worry about macroeconomic conditions.
On the bright side, the EPS number, $1.45, was above consensus for $1.37.
Micron shares declined one percent in late trading. The stock is already down forty-six percent this year.
The forecast for the current quarter is rather stomach-churning: revenue this quarter is expected to be $4 billion to $4.5 billion, way below the average estimate for $5.7 billion.
The forecast for profit, moreover, four cents to ten cents per share, is well below the average 69-cent estimate.
CEO Sanjay Mehrotra led in his prepared remarks with a positive note, stating that “In fiscal 2022, Micron generated record revenue of $30.8 billion and delivered our sixth consecutive year of positive free cash flow, allowing us to return a record $2.9 billion to our shareholders.”
Added Mehrotra, “our technology and manufacturing leadership in both DRAM and NAND, deep customer relationships, diverse product portfolio, and strong balance sheet put Micron on solid footing to navigate the weakened near-term supply-demand environment.”
But, Mehrotra is also cutting capital spending, including tools to make chips, by half:
“We are taking decisive steps to reduce our supply growth including a nearly 50% wafer fab equipment capex cut versus last year, and we expect to emerge from this downcycle well positioned to capitalize on the long-term demand for memory and storage.”
That will be not-so-great news to tools suppliers Applied Materials and Lam Research, both of which are down between two and three points in late trading.
In a deck of slides prepared for the report, Micron details how “Fiscal Q4 financial results were impacted by rapidly weakening consumer demand and significant customer inventory adjustments across all end markets.” No surprise there, with PCs and smartphones in free-fall for most of this year.
On the other hand, the data center market apparently continues to rage. Said Micron, “Cloud end demand remains healthy, driven by secular growth in AI and the digital economy. The market continues to face some supply constraints that are limiting server builds and macro uncertainties.”
The company said it expects both DRAM and NAND to be under-supplied by the industry next year, and it expects that while there’ll be a little bit less DRAM in 2023 than previously thought, sales of NAND flash chips will expand because they respond to price cuts (that’s nice for Charlie Giancarlo of Pure Storage to know).
My trips to the Apple Store at World Financial Center in Manhattan the last couple weeks suggested to me there is vibrant activity on the part of consumers checking out the new iPhone 14, and also buying stuff. It was hard to get waited on as staff were coming in and out of the back with boxes of stuff people were buying.
So much for my field research, which is always the most dubious form of perspective. You, like me, might have thought Apple’s doing pretty well with this latest round of stuff, but opinions vary, and the news flow has been less upbeat.
Bloomberg’s Debby Wu and Takashi Mochizuki wrote on Tuesday that Apple has told suppliers to halt an expansion of iPhone unit production and instead go back to its original production plan, citing multiple unnamed sources. Not everybody believed the report. Several Street analysts took exception; one, Ming-Chi Kuo of TF International, an oft-cited rumor expert, called the story “weird.”
However, it is not hard to believe that with the prospect of a recession, a fancy new phone could be causing Apple to revise some production plans, even though the thing was introduced at the same price as the previous model.
So, this is what makes a market, and today brought dueling views on Apple’s stock: a downgrade from Bank of America/Merrill Lynch’s Wamsi Mohan, and an upgrade from Rosenblatt Securities’ Barton Crockett.
On the negative side, Mohan writes that “strong outperformance [is] once again at risk” for Apple, and he expects “material” cuts to Street estimates for the current fiscal year ending in September of next year.
Mohan cut his rating to Neutral from Buy, and cut his price target to $160 from $185.
The reasons include a risk to consumer buying overall, writes Mohan, and “stronger Pro-mix won't offset decline in rev/profit if overall units decline,” meaning, people drawn to the most expensive version of the phone for its slick camera won’t be enough if overall units are under pressure.
Which brings us to unit sales estimates. Apple no longer discloses its unit sales like it used to, but the Street still compiles estimates.
The current FactSet consensus for this year’s iPhone unit sales is 245 million units, one million more than last year. Mohan was already a little shy of consensus, at 234 million, and has cut that to 219 million. His larger point is that even if the iPhone “Pro” units end up better than expected, at 132 million units versus the current consensus of 110 million, with the same blended gross profit, Apple’s gross profit dollars will plunge to $83.7 billion from $88.5 billion, because lower total units will dominate how many profit dollars Apple takes in.
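Mohan’s mix-versus-units argument can be sketched with a toy model. The per-unit gross profit figures below are hypothetical, not from his note; the point is only that a richer Pro mix can raise profit per phone while a large enough drop in total units still shrinks total gross profit dollars.

```python
def gross_profit_billions(total_units_m, pro_units_m, gp_pro=450.0, gp_base=300.0):
    """Total gross profit in $B for a unit/mix scenario.

    Units are in millions; gp_pro and gp_base are hypothetical per-unit
    gross profits in dollars for Pro and non-Pro iPhones.
    """
    base_units_m = total_units_m - pro_units_m
    return (pro_units_m * gp_pro + base_units_m * gp_base) / 1000.0

consensus_gp = gross_profit_billions(245, 110)  # more total units, fewer Pros
mohan_gp = gross_profit_billions(219, 132)      # fewer total units, richer Pro mix

# Profit per phone is higher in the Mohan scenario, yet total dollars are lower.
print(consensus_gp, mohan_gp)
```

With these made-up inputs the Mohan scenario earns more per phone but fewer gross profit dollars overall, which is the shape of his argument.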
Mohan also sees a slowing services business already at Apple, and prospects not looking great. “App Store and Licensing (Google payments), which account for over 60% of Services, have incremental risk of deceleration,” he writes.
For the defense, Barton Crockett of Rosenblatt raises his rating on the stock to Buy from Neutral, and he hikes his price target to $189 from $160, the main reason being what he sees as very positive results of a survey of a thousand Americans he conducted this month via SurveyMonkey.
“7% of respondents said they had already ordered an iPhone 14, and 22% said they expect to buy one over the next 14 months,” relates Crockett. “That sums to 29% of our census representative survey of U.S. adults via SurveyMonkey, suggesting that 75 million of the 258 million adults in the U.S. say they want one of these phones.”
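For what it’s worth, Crockett’s arithmetic checks out: seven percent plus twenty-two percent is twenty-nine percent, and twenty-nine percent of 258 million U.S. adults is roughly 75 million people.

```python
# Back-of-the-envelope check of Crockett's survey math.
us_adults_millions = 258
share_already_ordered = 0.07
share_intend_to_buy = 0.22

total_share = share_already_ordered + share_intend_to_buy   # 29% of adults
implied_buyers_millions = total_share * us_adults_millions  # about 74.8 million
print(round(implied_buyers_millions))  # rounds to 75
```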
The U.S., bear in mind, is historically forty percent of the volume of sales for Apple for the iPhone.
Of course, those people can change their minds, he concedes, but “we read this as a constructive demand backdrop, especially given rising macroeconomic pressures.” Basically, Crockett’s survey seems to suggest what I gleaned from my highly unscientific trip to the store.
Among the tidbits in the survey is that people seem to be drawn by the phone’s “Emergency SOS” feature. That’s the feature that lets you text a message to an emergency command center via a network of satellites when you’re out hiking, or otherwise beyond the reach of cellular reception and WiFi hotspots.
“The Emergency SOS feature that is new to iPhone 14 seems to have broad appeal,” writes Crockett, and “38% said this feature made them more likely to buy iPhone 14.”
Crockett notes also that, of those who said they had already ordered, forty-one percent had purchased the iPhone 14 Pro “Max,” the one with the 6.7-inch screen versus 6.1 inches for the normal Pro. (The language is getting tricky with these products, ‘the normal Pro’!) Of those expressing an intention to buy, forty percent say they intend to buy the Pro Max.
Crockett dismisses the Bloomberg story in short order:
Bloomberg's report yesterday that Apple was walking back previous guidance to suppliers for a modest bump to 2H22 production back to an original target of flat should be read in the context of consumers' clear preference for the pricier models with higher ASPs. There is also a recent history of comparable reports proving to be misleading when actuals come out.
Crockett doesn’t publish iPhone unit estimates. His estimate for this year’s total iPhone revenue, however, $217.19 billion, is lower than Mohan’s estimate of $220.5 billion, though higher than the consensus of $208 billion.
What that tells me is that the Street is currently not expecting as many pricey Pro or Pro Max units in the total mix, leading to lower revenue estimates than either analyst.
There you have it, two different views on the matter. I will offer a third bit to this, mostly unscientific. Below is a twelve-year history of iPhone debuts in one chart. Most of them have been positive, measured from the price just before the iPhone debut to fifty-two weeks later. (The years 2015 and 2019 each had an extra week.)
In magenta, you can see the performance of Apple stock since the September 7th debut of iPhone 14, not quite four weeks. Its downward trajectory is most akin to the downward trajectory in the first four weeks of the 2012 to 2013 year, which ended up being the worst on record.
Note, by the way, that each down year was followed by a very solid up-year.
Since this is unscientific, I would in no way presume that such a move means anything material about what’s going to happen this time around.
Pure Storage CEO Giancarlo: potentially a lot of large cloud deals down the road
Sep 29, 2022
Pure Storage is running circles around larger competitors such as Dell and Hewlett Packard, but the big prize may be sizable deals to replace the vast bulk of cheap storage inside cloud providers, says CEO Charlie Giancarlo during a meeting at the company’s satellite office in Manhattan.
Within a banner year for Pure Storage — a string of revenue beats, and really large EPS beats, and two increases so far to the year’s revenue outlook — the singular development that the Street was really taken with was the announcement this past spring that Pure won a large contract to supply technology to Meta Platforms, owner of Facebook, for a major new artificial intelligence research computer, the RSC, or “Research SuperCluster.”
That announcement had caused the Street at the time to wonder, rather excitedly, if there is more of the same kind of deal in store with more cloud operators. After all, the deal had the effect of boosting Pure’s quarterly sales last October by several percentage points.
At the time that he was asked, Pure CEO Charlie Giancarlo was rather coy, but it seems there could, indeed, be more of such large deals in store.
“How many Meta-like deals are there? We think there are potentially a lot,” says Giancarlo in a meeting we had last week at Pure’s East Coast satellite office at 1 Penn Plaza in Manhattan. Giancarlo was on one of his quarterly swings through town from the company’s headquarters in Mountain View, California.
Without forecasting anything, Giancarlo took me through his thinking.
The Meta deal was for two distinct products from Pure, “FlashBlade,” and “FlashArray/C.” FlashBlade is for very high-performance kinds of applications, such as databases. FlashArray/C is for things with less-stringent demands, such as bulk file storage. The vast majority of the capacity Meta bought from Pure, ninety percent of the total one hundred and eighty-five petabytes (a petabyte is a thousand trillion bytes), was for the more-economical FlashArray/C.
To win that FlashArray/C sale, Pure’s systems using NAND flash memory chips had to beat out Meta’s internal teams, which were using disk drives with custom software. Flash is generally better-performing than disk but costs more money.
It turns out that the bake-off inside Meta is something of a model for future potential deals, says Giancarlo. He sees a prospect for many “hyper-scale” companies, the companies that run the cloud, such as Amazon and Microsoft and Google and Oracle and IBM, to replace their disk drives with better-performing flash memory chips in the form of FlashArray/C for little or no premium.
“Eighty to ninety percent of all the bits in the world are still on hard disks, and that’s true inside hyper-scalers as well,” explains Giancarlo. That vast trove of most of the world’s data constitutes what’s called secondary storage, or “near-line” storage, the stuff that is less performance-sensitive.
“What's really interesting,” says Giancarlo, “is that flash [memory chips] have been improving in price-performance, on average, fifteen to twenty percent a year for decades,” relative to disk, based on the cost-per-bit of storage in the two media.
“The intersection point, if you look at IDC data, between flash and disk, they estimate, will cross over in about four years,” meaning, flash becomes equal in price for the same capacity with disk. That is the “crossover point” that Giancarlo and I discussed back in December.
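The crossover arithmetic can be sketched quickly. The starting price premium below is a hypothetical input for illustration, not a figure from IDC or Pure:

```python
import math

# A sketch of the crossover arithmetic: if flash carries a
# price-per-bit premium over disk that narrows by some fraction each
# year, when does the premium shrink to parity (1.0x)?

def years_to_parity(premium: float, annual_narrowing: float) -> float:
    """Years until flash's price premium over disk reaches 1.0x,
    assuming the premium shrinks by `annual_narrowing` per year."""
    return math.log(premium) / -math.log(1 - annual_narrowing)

# For example, a hypothetical 2x premium narrowing at the 15-20% per
# year that Giancarlo cites:
print(years_to_parity(2.0, 0.15))  # ≈ 4.3 years
print(years_to_parity(2.0, 0.20))  # ≈ 3.1 years
```

Under those assumed inputs, the result lands in the same ballpark as the “about four years” Giancarlo attributes to IDC; a smaller starting premium, which is Pure’s claim for FlashArray/C, pulls the crossover closer to now.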
But for Giancarlo and Pure, the crossover point is sooner — it is right around the corner. “Our FlashArray/C, we feel, crossed over disk now, and with the next round of price reductions coming for flash, we feel we're going to be able to replace what's called the cheapest disk, the 7200 [RPM],” the stuff used for near-line, says Giancarlo.
The price reductions he is talking about are expected price chopping by Micron Technology and other makers of flash chips. Micron, as you may recall, has been in free-fall of late because the PC and smartphone markets have been breaking down all year, leaving excess supply of flash. The company’s forecast for revenue back on June 30th missed by a mile. Many expect that when Micron reports results tomorrow, September 29th, the results will again miss expectations.
Pure buys flash from Micron and others and combines it with Pure’s specially designed software to make highly-effective systems. If the memory market is in a sustained crisis of fall-off in demand, Micron and other vendors will have to cut prices on flash to move units, and that, says Giancarlo, will make Pure’s already economical FlashArray/C even more economical.
As a result, “We would like to believe that, maybe not every hyper-scale or every cloud company, but that many of them would choose to work with us at some level to be able to start replacing disk with flash.”
Giancarlo has another reason to think that may be the case: energy.
Pure touts not only the performance advantage of its systems but also their energy efficiency, even compared with other vendors of flash-based systems such as Dell Technologies and Hewlett Packard Enterprise, because Pure’s software and systems engineering uses the raw flash more intelligently than the competition.
“We have advantages on cost, we've advantages on the utilization of the asset, which lowers energy use and reduces the size of all of those things.”
Giancarlo is an enthusiastic traveling salesman — he has in the past described his competition with Dell and Hewlett as “a knife-fight in a phone booth.” Not only did Giancarlo visit accounts in Australia and Japan over the summer months, he spent six weeks this past spring, he tells me, calling on customers in nine European countries, including the Bank of Ireland.
“What I learned is that, yes, power, of course, is important, but now they're afraid of power availability, and that's going to drive even more conservation than the cost by itself.” With Russia’s war in Ukraine, Europeans are concerned with a lack of fossil fuels, says Giancarlo — a sentiment that must be amplified by the report this week of the mysterious leak of the Nord Stream pipelines.
Now, the discussions Giancarlo is having with customers have shifted.
“I've been promoting at Pure lower space, power, cooling for my five years here, and for four years, nobody cared,” he says. “I’d go to customers and they say, Yeah, we don't pay for the power, that's facilities, we’ve got plenty of space, we don’t care about that, what’s your price?
“Now,” he says, “space, power, cooling — it's still an economic issue in the US, but in Europe it's, Are we even going to have the energy? So, of course we've got to reduce the amount of energy we’re using because we're afraid we might not even have it.”
When he calls on those European customers, Giancarlo has brought with him third-party analyses supporting the energy savings. “Fortunately, the facts tell the story, which is that compared to hard disk, flash is just so much more efficient, it’s literally more than ten-X,” he says, meaning, it cuts the necessary space, the power required, and the labor involved by ten times.
But compared to Dell or Hewlett or others, “we're somewhere between twenty percent to fifty percent of the power they use, so, we’re either twice as good or five times as good” from a performance-per-watt standpoint.
Pure is the only company that takes the raw flash chips and wraps special-purpose software around them rather than using commodity solid-state drives, the industry standard building block for flash-based storage. That allows Pure to have such low-level control of the use of the chips that it can wring all kinds of efficiencies out of the parts.
When does that energy advantage show up in deals?
“It already showed up in the Meta deal as a winning factor,” he says.
In the RSC, notes Giancarlo, “Meta are going to have an exabyte of storage in that data center, and they've got massive amounts of [Nvidia] GPUs.” What that means is that “they have to fit all of that in a physical footprint as well as a power footprint, and of course, they want to use most of the power for the GPUs.”
“We were the only ones that matched every one of those components,” he says, “that is, we had to get to the price right, but we were the only ones that would fit in the data center, both from a power and a space standpoint.”
Pure typically gets about a third of its revenue in any given period from cloud companies, though not necessarily the top three of Amazon, Microsoft and Google, but rather the top ten or twenty, says Giancarlo.
“We'd like to be part of the infrastructure of the largest,” he tells me, “and we think we have the opportunity to do that over the next several years,” mostly because the ninety percent of data still on disk needs to be rethought, and maybe because energy use is only going to become a greater and greater bugbear.
“As flash pricing comes down, they need to start doing something about their disk environment,” says Giancarlo of the cloud majors. “And we have the unique intellectual property that they don't have.”
I would add that, given the rising sense of concern in the industry about the sticker shock over the cost of cloud, it is possible that Amazon and the others could seek out Pure as a way to lower the storage cost for their millions of cloud customers, though Giancarlo is not banking explicitly on that prospect.
Pure shares, at Tuesday’s close of $27.66, are down sixteen percent this year.
The stock is also up six percent since I picked it for the TL20 group of stocks to consider in mid-July, making it one of the top performers of the TL20. When I showed Giancarlo the TL20 at our meeting, he expressed his pleasure at the company he keeps, not only Arista Networks, on whose board he sits, but also Applied Materials, a company of which he thinks very highly.
Sumo Logic CEO Sayar: The complexity of the trans-cloud is our friend
Sep 28, 2022
The debutante ball of an initial public offering is often followed a year or two later by a more substantial affair, the very first “analyst day” meeting, when management talks to the Street about its achievements as a public company and where it sees its business going.
You could call it the cotillion, when more is revealed than could be at IPO.
Last week, a promising young company had its turn at the cotillion, at the Nasdaq Market Site in midtown Manhattan: Sumo Logic, the twelve-year-old developer of tools for programmers, IT staff and security specialists.
At the Nasdaq, CEO Ramin Sayar, whom I had interviewed earlier this year, was kind enough to talk with me about the big reveal, and to place it in a deeper context.
The centerpiece of these analyst events is the multi-year forecast that a company’s management offers.
“It's better to set some goalposts because sentiment has changed dramatically,” Sayar observes as we make our way to a conference room.
Indeed, the mood among software investors has changed this year from exuberance to an attitude not very hospitable toward software companies not yet making a profit. Sumo is not yet profitable.
When first we spoke, in March, Sayar had insisted that CEOs and CFOs cannot let investors dictate their business strategy. That’s probably true. But these days, every CEO has to reckon with the changed environment. Setting goalposts is a way to mollify nervous investors.
When it was his turn to speak during the day’s presentations, Sumo’s CFO, Stewart Grierson, set the goalposts: Sumo expects revenue to rise by almost seventy-five percent between this fiscal year, which ends in January, and the fiscal year ending in January of 2026, reaching half a billion dollars, and expects to go from a negative operating margin of about twenty percent to break-even.
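Those goalposts imply a steady compound growth rate. A quick sketch of the arithmetic, assuming the span runs three fiscal years, which is my reading of the company’s January fiscal calendar:

```python
# Implied arithmetic behind Grierson's goalposts: ~75% cumulative
# revenue growth to $500M over three fiscal years (FY ending January
# 2023 through FY ending January 2026).
target_m = 500           # revenue goal, millions of dollars
cumulative_growth = 0.75
years = 3

implied_base_m = target_m / (1 + cumulative_growth)        # ≈ $286M now
implied_cagr = (1 + cumulative_growth) ** (1 / years) - 1  # ≈ 20.5%/year

print(f"base ≈ ${implied_base_m:.0f}M, CAGR ≈ {implied_cagr:.1%}")
```

In other words, the plan assumes roughly twenty percent annual growth off a base of about $286 million, not a reacceleration.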
Beyond just goalposts, the remarks by Grierson included pledges to bring about various “efficiencies,” things such as greater marketing and sales team productivity that will reduce the total sales and marketing spend as a percentage of revenue from forty-eight percent to as low as forty percent by 2026.
Things such as automation, said Grierson, can help drive down the company’s general expenses to just ten percent of revenue from seventeen percent today.
“You want them to walk away with those goalposts, and also understand what are the knobs,” is how Sayar describes the intention. Some knobs are easy to twiddle, Sayar indicates, like sales and marketing and some general expenses. “But, you can’t necessarily dial up the innovation and engineering investment and then dial it down,” he cautions.
Some CEOs are all about the numbers, but Sayar, himself, is not overly preoccupied with the financial knobs. A veteran of enterprise software over many decades, he is more intently connected to the mission his company is supposed to be fulfilling, what the actual software is supposed to do.
What was really to be conveyed at Nasdaq, he tells me, “is for them to understand that this is a must-have, not a nice-to-have,” they being the analysts, this being his company’s software.
Being essential, if that’s what it is, appears to be good for business. At a time when some companies have been warning of “push-outs,” delays in getting a software deal signed, “we didn't see, necessarily, any slowdown in the types of deals or sales cycles or opportunities” last quarter, says Sayar.
Just what is it that makes the company’s wares indispensable? It starts with the market need, according to Sayar.
Sayar is adept at portraying the frenetic energy of what is commonly referred to as “digital transformation,” sometimes referred to as “digitization” or “digitalization,” the headlong rush of companies to churn out code as they turn everything into an app.
“You and I are inundated in our personal and professional lives every single day with this dichotomy of work versus personal stuff on every device,” he explains. All that connects back to some stuff that sits in the cloud. “The back-end of that experience are cloud applications, and companies are not slowing down on this — spending on that is not going down.”
“There are seven hundred and sixty million cloud-native apps out there, and that's a fraction of how many will be out there” in future, he notes, and that’s just counting the newer apps that have been built, never mind the stuff that is from years ago that needs to be maintained and/or overhauled.
Sumo’s programs fit into the rush of app development by helping to monitor the app development process, to check for security violations, and to see how infrastructure is performing. In those functions, Sayar’s main point is that unlike competitors such as Datadog, New Relic, Elastic and Splunk, Sumo’s software can handle the way that the infrastructure of the cloud is becoming more complicated.
“You don’t have a static model of infrastructure,” explains Sayar, meaning the server computers and network switches and data storage. “Today it's here, five minutes later, it's there, it’s in a different pod, in a different region, different deployment.
“So, you have no context to whether that’s abnormal or normal because you've collected memory, disk, CPU, response time here and now it’s running on a different pod, different cluster there, you don’t have a baseline to compare it.”
Sayar is painting a picture of chaos. The only way, he says, to get a handle on that chaos, is “if you get traces, and you get logs, and you look for diffs, and you have to get in, crack open the payload of logs, and then look for abnormal behavior by cluster, by host, by location and correlate those,” a steady stream of the jargon of IT management.
It’s bewildering if you’re not in the flow of IT management. Some of it is clarified, some of it is made murkier, by a dip into patent filings.
The patents awarded to Sumo’s CTO, Christian Beedgen, and colleagues speak to that complexity, with claims covering things such as a cloud-computing system to tag and categorize a constant flood of data.
U.S. patent number 9,633,106, awarded in 2017, is exemplary. It describes Alice and Charlie, two IT administrators who seem to be swimming in a sea of different kinds of mismatched data. It is poor Alice and Charlie who have to “crack open the payloads” and root around to find what’s wrong.
Somehow, the Sumo Logic technology described in the patent allows them to keep their heads above the task.
But don’t his competitors, such as Datadog, do that? “No, we do that,” insists Sayar. Those other programs “tell you the what,” as in, there’s a problem. They don’t, he asserts, “tell you the where, why and how.”
As much as Sumo is being propelled by the chaos of digitalization, it is also riding the wave of new infrastructure, programs such as Kafka, the open-source software that acts as a kind of firehose to move data in real time between applications.
Kafka is just one of a series of new infrastructure software programs that have emerged in recent years, such as Spark, that seem to be adding up to something larger and more important than the traditional cloud computing of Amazon AWS and Microsoft Azure.
I first termed it, two years ago, “The Great Cloud Rush,” a wave of companies built in the wake of AWS that aim to span multiple cloud computing networks, a cohort of, sort-of, meta-cloud companies, if you will. The term that has been used in trade magazines is “multi-cloud,” but it’s a rather dull term that doesn’t really do justice to the froth of complexity that is bubbling up above the cloud services.
I throw out to Sayar the term that comes to mind: trans-cloud. “It is something that spans,” he nods his agreement, something that stretches across data centers. The Cloud Rush, or the trans-cloud, as a kind of mess, should be a good opportunity for Sumo, I venture.
“It’s fantastic!” he replies. “Because guess what? That architecture, while that delivers the promise of scale, it brings complexity, and that complexity is what we've been trying to simplify,” the overarching mission, again, of Sumo.
Snowflake and MongoDB are examples of young Cloud Rush companies, companies building a business of databases on top of multiple cloud service providers. “But operational data is different” from database data, says Sayar. Unlike the neatly organized database tables that Snowflake or MongoDB manage, the operations data generated when apps have gone down, the problem of Alice and Charlie in the patent document, is a problem of the Kafka firehose spewing torrents of totally unorganized data.
The complexity means constant uncertainty.
“Before, you knew you had a dedicated server,” in the corporate data center, Sayar reflects. “Today, it’s opaque, and that makes it hard to know the what, where, why and how.”
The project of Sumo, he says, “started from the bottom” of all that, building tools that “collect, reason, aggregate the data, to be able to create samples and patterns and algorithms and statistical correlation, and add in metrics and then events and traces all the way up to the top of the iceberg, so to speak, to show the ding-dong lights!”
The ding-dong lights are the little indicator in the Sumo Logic programs that tell the developer or the IT manager or the security specialist, You’ve got a problem.
The ability to handle all that complexity is what Sayar has had his sights on since he came aboard in 2014, he says, the ability to develop generations of product as a portfolio, not a single product. He calls that the “horizon” development model, where with each horizon, the company aims to broaden its addressable market.
“How do we go into these adjacent markets more broadly around monitoring, observability, security, IoT, and edge device computing and more,” is the challenge, he says. The company is currently headed to “Horizon Four,” says Sayar, in which so-called autonomous computing is the focus, more things happening automatically.
In the patent documents, the IT people Alice and Charlie still do a lot by hand. They must be exhausted. The future, says Sayar, is one in which analyzing programs is done by other programs in an automated fashion.
To spearhead the march toward Horizon Four, Sayar in May hired Tej Redkar as chief product officer, who had been a heavy user and evangelist of Sumo software at prior software companies.
Part of Horizon Four is to bring the various functions of app development and IT operations and security closer together in one product suite. The original goal, says Sayar, of Beedgen and his cofounders had been to simplify their own responsibilities.
“Remember how we got started,” says Sayar. “Our founders were frustrated with all the crap that was happening downstream to them as developers working on security software, because a lot of the headache that was happening downstream was happening upstream.
“There's this joke that development did the partying, security got the hangover,” recalls Sayar. That just means that various constituencies inside “DevSecOps,” as the rubric is known, can never agree on how to divide responsibility. “The reality is, in this world, because of the continuous nature of how things” — meaning, apps — “are pushed to production and updated, security can't be an afterthought.”
More chaos, in other words.
The Horizon Four work won’t be complete for another three to five years from now, says Sayar. It will hopefully bring ever-greater integration of those disparate roles and responsibilities, he says, to reduce the headaches for everyone. “That pain that our founders had was built on that premise of, I want to make my job easier.”
The reception to what happened at Nasdaq seemed to be largely positive. There were no changes in stock rating or price target, that I saw, but the goalposts were well-received.
Matthew Hedberg with RBC Capital Markets described himself as “feeling good” coming out of the event, deeming the financial targets put up by Grierson to be “prudent” in light of an uncertain macroeconomic outlook.
Hedberg has an Outperform rating on the shares, and a $14 price target.
Blair Abernathy with Rosenblatt Securities, another bull on the stock, wrote that he “came away from the event encouraged that Sumo has now set a clear, achievable path to profitability.”
Sumo stock, at a recent $7.20, is down forty-seven percent this year, and about that much since its IPO in September of 2020.
A message from the editor
Sep 27, 2022
Tiernan Ray, creator and editor of The Technology Letter, takes you through his thinking about the group of twenty stocks, the TL20.
Qualcomm shows off its automotive chip chops
Sep 23, 2022
CEO Amon, who is naturally outgoing, and who spent years in the trenches engineering Qualcomm’s technology before ascending to the top spot, went into extra innings fielding questions with vigor and acumen.
Qualcomm chief executive Cristiano Amon on Thursday gathered his team at the Classic Car Club beside the Hudson River in Manhattan for several hours of presentations to the Street and press about the company’s expanding opportunity for chips in the automotive market. It was the first time Qualcomm has done a dedicated day just to talk about cars apart from its other businesses.
The presentation was encouraging. The company increased some of its forecasts for revenue from cars for the next several years, and Amon, who is naturally outgoing, and who spent years in the trenches engineering Qualcomm’s technology before ascending to the top spot, went into extra innings fielding questions with vigor and acumen.
A ream of details on products was provided by Nakul Duggal, Qualcomm’s general manager of the automotive division, covering the three areas that are the company’s current focus: connectivity, such as the modem for the car-to-Internet connection; the “digital cockpit,” a rubric covering all manner of things that happen with the dash, the center stack, the backseat entertainment, and on and on; and the “advanced driver-assistance system,” or ADAS, all the technologies that will someday make a car drive itself, or so they say.
CFO Akash Palkhiwala rounded out the presentation with several significant financial updates that seemed to be very well received by analysts at the event.
The key stats are as follows. The company’s revenue from automotive is expected to rise by thirty percent in the fiscal year ending this month, to $1.3 billion. That’s out of total company revenue of forty-four billion dollars, so auto is still at the starting line as a part of Qualcomm’s revenue, but rising fast.
The company estimates its “addressable market” for chips in cars at one hundred billion dollars annually by 2030. That’s assuming Qualcomm makes $200 to $3,000 per car.
The company’s “primary KPI,” or “key performance indicator,” is the “design-win pipeline,” the total value of expected future contracts with manufacturers, based on the lifetime of the deal for that part in that car, assuming certain volumes of car shipments. That figure is currently thirty billion dollars, up from thirteen billion mentioned at the investor meeting Qualcomm held in November of last year.
That number doesn’t include some major manufacturers that Palkhiwala intimated will come aboard in coming years, without naming names.
And the current view into revenue has taken a meaningful jump upward from the November meeting. At that time, Qualcomm forecast three and a half billion dollars in auto revenue, annually, by 2026, and now it’s forecasting over four billion. Its estimate for fiscal 2031 has risen from eight billion to now over nine billion.
Compare that nine billion to the hundred-billion-dollar forecast, and you can see that Qualcomm is expecting to have roughly ten percent of the market by 2031.
An impressive aspect is that the new, increased revenue forecasts don’t include some parts of Qualcomm’s business that are already in development with customers. For example, RF chips, the technology for sending the signals from the cockpit wirelessly to cell towers. Qualcomm is in discussions on selling those parts, the executives said, without disclosing details. They would represent additional future revenue.
Amon, center, with CFO Akash Palkhiwala, left, and Nakul Duggal, general manager of the automotive division.
Also, 5G wireless networking, once it is turned on in cars, which will be some years yet, is expected to add five dollars per car in licensing fees for Qualcomm’s intellectual property division, Qualcomm Technology Licensing. That is very high-margin revenue because it’s almost all profit. The company has over fifty contracts in place for such licenses and expects the deployment will become material starting in 2024.
Even more impressive for investors is that Qualcomm has ninety percent of the forecast revenue for the next four years already accounted for by “design win” contracts.
Amon summed up the talks by declaring “it’s probably graduation day for the Qualcomm automotive business as we position ourselves and establish partnerships to be one of the largest automotive providers of technology for the future of automotive.”
By way of reference, Nvidia, which has long sold a lot of product into cars, and which has big ambitions for cars as well, trails Qualcomm with a projected nine hundred million dollars in automotive revenue this fiscal year.
I would have enjoyed more discussion of Qualcomm versus Nvidia because I find it fascinating. In particular, ADAS is the newest area for Qualcomm, and it is the area where I expect to see the greatest head-to-head competition with Nvidia in years to come.
What Amon did say was that Qualcomm is taking a much broader approach than Nvidia, trying to cover more of the aspects of Detroit’s profound changes.
“The goal is to help with the auto industry's transformation,” said Amon. “So, our approach is actually not specific to a domain or a feature or what are you going to do to deploy this specific feature, it is actually building a partnership, understanding where flexibility is needed […] and dealing with that complexity.
“So, it's a very different approach than say [Intel’s] Mobileye that is focused on ADAS, or an Nvidia that is focused on a specific technology and its application into the automotive space.”
Among the questions, I noticed a certain skepticism of the hundred-billion-dollar market size that was offered for 2030.
Analyst Gary Mobley with Wells Fargo noted that some market research firms have different estimates. Said Amon, “You can find a lot of different forecasts that give you a different set of numbers, but our framework is the following: we are negotiating regularly with these OEMs on different platforms, and we have a very good sense of two things, first, what they plan to deploy in their cars, in the short term, and then, second, how they plan to scale it differently over the long term.”
That makes sense. The company is actually working in this area, unlike a lot of market research people who are simply driving spreadsheets.
The venue for Qualcomm’s meeting, the Classic Car Club, has a hangar full of neat-looking stuff.
The most important thing about those conversations with customers, in my view, is that they appear to be at a deeper level than simply as a parts supplier. Said Amon, “You see now a senior level of engagement, in the number of CEOs that we have been engaged to in this industry, because they have to make the right choice on the platform, and then build on top of the platform. And that's very different with how the industry used to be.”
That makes sense, too. The challenge of how to make something much more gigantic than a smartphone, where there is no Apple and no Alphabet dominating the basic gestalt of the car, means that for the moment, these car companies are turning to Qualcomm for a close kind of collaboration to define what is to be built from step one.
That sounds like a good place to be.
Qualcomm shares are down fifty-seven percent this year and down fourteen percent since I picked them for the TL20.
Is Nvidia serious about software?
Sep 22, 2022
Huang says running software to rent in the cloud will be “very long-term SaaS platforms for our company,” but does he mean it?
Nvidia has become the biggest chip vendor in the world, at three hundred and thirty-seven billion dollars in market capitalization, by dint of the fact that none of its competitors has managed to come up with chips sufficiently superior to crack the company’s hold on the market for the most cutting-edge applications in data centers, especially artificial intelligence.
There is now a prospect of another interesting realm for Nvidia to exploit: software.
This week, Nvidia held another one of its “GTC” conferences, where it touts lots of new products. While the event was full of stuff about new chips, there was also the announcement of two cloud computing services that Nvidia will own and operate.
One service is a way for companies to collaborate on 3-D design via the cloud, based on Nvidia’s “Omniverse” technology, which is its take on The Metaverse, the mostly non-existent something that Meta’s Mark Zuckerberg has touted. I gave a preview of the idea of this service in a recent article.
The other service is a way to run very large AI programs in the cloud without a lot of the owned infrastructure or data science work, especially programs for handling natural language processing, which are becoming important tools for companies.
I covered the announcements for ZDNet. But, beyond the press releases, what interested me was the business question of whether Huang is really serious.
The company is planning to run Nvidia-based computers on behalf of its customers that will let them do some of what they would do with Huang’s chips by renting computing in the cloud rather than buying those computer systems outright.
It’s the kind of thing that makes me wonder, Does the chip maker merely want to juice the market for its product, or is it really committing to a new line of business?
And so, when I took part in a press conference Wednesday morning, I asked Huang directly, How big can this software stuff be for Nvidia over many years?
“Well, it's hard to say; that's really, kind-of, the answer,” was Huang’s initial reply.
But then, in true Huang style, he went on to ruminate at length about the importance of the two services, AI and Omniverse. Huang is not only ambitious and extremely smart, he is genuinely intellectually stimulated by large questions of where markets and technology are headed.
In my second ZDNet article, Wednesday, I printed Huang’s full response to my question. To summarize it for you, he reflected that large AI programs of the kind he plans to offer in the cloud are becoming essential to every business on the planet, and so, running them as a service “is potentially one of the largest software opportunities ever.”
On the matter of Omniverse, while much of The Metaverse is just complete vaporware, in my view, Huang had an interesting technical point that opened my eyes somewhat. He emphasized that the service he plans to run is a database.
“It’s a modern database in the cloud, except this database is in 3-D, this database connects multiple people.”
Now, that is interesting. A fantasy world of people running around as avatars, having adventures, the way Zuckerberg describes, strikes me as a pipe dream. But the database market is a real market, a multiple-hundreds-of-billions-of-dollars market, if you count all the stuff sold by Oracle and IBM and Microsoft and Amazon and MongoDB and Couchbase and many others. A database in the cloud sounds like a real application and a real market.
And so, without predicting success for Huang, I would say that his technological soliloquy was thought-provoking. I have noted in the past that Nvidia was moving in the direction of selling software in the cloud. Over a year ago, I wrote, “now, Nvidia is selling a cloud where people can come and rent AI capability, provided by Nvidia in conjunction with a partner, data center operator Equinix, called ‘LaunchPad.’”
LaunchPad, however, truly seemed like a starter kit, a demo to convince people to try and then buy Nvidia chips. The remarks by Huang on Wednesday, in contrast, tilt things just a little bit in the direction of having an actual commitment to a software business.
Huang concluded by saying, “these two SaaS platforms,” meaning, software-as-a-service, the industry rubric for running programs in the cloud, “are going to be very long-term SaaS platforms for our company, and we'll make them run in multiple clouds and so on and so forth.”
I would note the Street’s reaction this week ranged from outright skepticism to mild — very mild — enthusiasm.
The most enthusiastic note I saw came from Rajvindra Gill of Needham & Co., who has a Buy rating on Nvidia shares. Gill seems to believe software can really be something over time for the company.
Writes Gill,
Longer term, NVIDIA's push into SaaS-like offerings through Omniverse can be a differentiator. Their current software run-rate is “a couple hundred million dollars.” We expect the product-market fit for recommender systems and large language models to drive increased hardware and software (“full, vertical stack”) adoption.
The jury is still out on whether software is going to be a big business for Nvidia. If it were to become such, the effect would be to give Nvidia’s shares potentially a higher valuation multiple, although that is hard to imagine given that Nvidia stock is one of the most expensive chip stocks around.
Meantime, back on planet earth, the question at the moment for Nvidia is how big a deal the latest restrictions are on Nvidia’s sales to China. On August 31st, the company said in its quarterly filing that new restrictions by the U.S. Department of Commerce on export of some Nvidia chips to China will cost the company four hundred million dollars in revenue.
Some have opined this may not be as big a deal as thought. For example, Hans Mosesmann of Rosenblatt Securities on Tuesday wrote that “We speculate that the licensing demands by the Dept of Commerce are meant to slow-down potential upcoming or ongoing programs in the target regions (which this will do but not by much), and to annoy,” rather than being a broad ban on sales.
On Wednesday, Huang was asked at the press conference about the bans. His response was that the restrictions do not shut things down so much as perhaps create extra hoops to jump through for Nvidia and its customers in the country, without being a disaster:
You can't have completely open, unfair trade, you can't have completely unfettered access to technology without concern for national security. But you can't have no trade … And so, I think it's just a matter of degrees. The limitations and the licensing restrictions that we are affected by gives us plenty of room to continue to conduct business in China with our partners, gives us plenty of room to continue to innovate and to continue to serve our customers there. And in the event that the most extreme examples and use of our technology is needed, we can go seek the license.
Nvidia stock is down fifty-five percent this year, and down sixteen percent since I picked it for the TL20.
Zuora CEO Tzuo: Media giants are learning not just to explore but also to exploit
Sep 22, 2022
In the realm of evolutionary psychology, there is a notion of two paths to survival: explore and exploit. You’ll never know what the environment offers you if you don’t first explore your surroundings. But constantly exploring means ignoring the things you’ve found. At some point, you lean on exploiting what is already known. Survival is a balance of both.
What is known as the “subscription economy,” a new form of sales in which customers buy into an ongoing relationship with a product maker, has entered a new phase in that balance between explore and exploit, according to Tien Tzuo, who is the founder and CEO of Zuora, which makes applications for companies to bundle, price and collect for those subscription-economy offerings.
“The subscription economy is shifting into a new phase,” Tzuo tells me in an interview we had recently via Zoom. “It used to be all about acquire, acquire, acquire, which Netflix did,” says Tzuo, referring to the massive growth of the streaming video enterprise that came to a screeching halt last year. Netflix was the classic example of explore for the past several years.
However, that game has now become rather passé, says Tzuo.
“Disney acquired the same number of customers as Netflix in a fifth of the time,” he observes of Disney’s own streaming video offering. “It took Netflix ten years, it took Disney two.”
In other words, the first phase, acquiring customers, has become something of a dead end. There has to be something else or a company that’s just acquiring will run aground like Netflix did.
Disney is an example of a company that is figuring out the second phase of subscriptions, says Tzuo, where a prospective subscriber is more ingeniously presented with increasing amounts of cross-selling and up-selling.
Disney, and other companies in media and publishing, says Tzuo, are increasingly regarding their business as exploiting what they’ve got as much as simply trying to lure new prospects.
“Look at what is happening in media and publishing, it’s all about managing these subscriber journeys, how do you convert an anonymous customer to a subscriber?” explains Tzuo.
The word “journey” is one of those ill-defined euphemisms that really imply what was traditionally known in the cable industry as the “lifetime value” of a subscriber. The Netflix model for lifetime value was a simple straight-line calculation based on a straightforward pricing model, and that was fine for as long as new subscribers were easy to obtain.
Nowadays, finding new value in that subscriber is the challenge.
The New York Times, for example, which is a customer for Zuora’s software, has said that at its peak, one out of every two Americans went to the Times Web site or app to find out about the Coronavirus. And the Financial Times had a huge surge with free articles on Brexit back when that was the dominant headline, Tzuo points out.
“So, the newspaper companies are realizing, look we can create a bunch of new visitors but then the game begins afoot: How do you convert the anonymous users to some subscribers, how do you cross-sell and upsell to more subscriptions?”
A company such as The Times might say “you’ve been playing Wordle,” the Times’s daily free puzzle consisting of word-guessing, “then maybe you will sign up for Spelling Bee,” a Times game where you construct as many words as possible from a fixed set of letters. Spelling Bee is not free; it requires a paid subscription to either a Times Games package or to print or online Times editions. The point is, people are induced to move from free to paying.
Or take Disney, he suggests. “You have Hulu, you sign up for Disney+,” the streaming service, which would bump a subscriber from $6.99 a month to $13.99, or even $19.99 if they choose to avoid commercials.
What, you may wonder, does this have to do with Zuora? If his customers are willing to dig more deeply into subscribers’ passions, and pockets, Zuora also has an opportunity to sell more to the same customer base.
For that reason, Tzuo is on the lookout for things he can add to his quiver. The company on its earnings call last month announced it paid $45 million in cash to buy London-based startup Zephr, which sells tools to help companies monitor and analyze that experience of which Tzuo speaks.
The purchase was part of a “war chest” as Tzuo calls it, from private equity firm Silver Lake, which earlier this year agreed to purchase four hundred million dollars worth of Zuora convertible debt expiring 2029.
“They were an existing partner of Zuora,” Tzuo says of Zephr, “they are focused on a key vertical [market], the fastest growing one, the media and publishing vertical,” precisely the companies such as The Times and Disney who are being most aggressive about exploiting and finding greater subscriber value.
The tools of Zephr will be another product offering for Zuora alongside the tools to bundle, price and collect from subscribers. It will be a tool to mine what draws subscribers, to further draw them into additional offerings.
“We like the joint vision of the subscription experience — that is very much part and parcel of us,” says Tzuo of the kindred viewpoint with Zephr. “We are morphing from just a monetization platform to a monetization-plus-experience platform — going deeper — seeing our customers’ customers, what they see — the subscribers.”
There is a symmetry here, to find more lifetime value in Zuora’s customers as those customers try to mine their own customers.
Customers such as The Times are the vanguard to Tzuo. “Media and publishing are a little farther ahead” in mining the value of the customer journey, he says, but he expects that some day, all companies, even hard goods manufacturers, will follow suit, selling everything as a subscription to subscribers who can be more and more deeply mined.
“To be fair, today it [Zephr’s technology] is focused on the media publishing industry,” he says. “We do believe that the fundamental technology for Zephr is broadly applicable.”
To Tzuo, the focus on The Times and Disney and their ilk is part and parcel of what he has pledged to do. Last year, Tzuo changed the company’s approach to how it markets and sells its own programs. The new “model” he adopted, what he calls the “multi-product strategy,” meant working patiently with customers to let them add more of Zuora’s programs over time, as the customer’s needs increase.
Sometimes, that means patiently building a relationship till the moment is ripe for a sale — similar, he believes, to what his customers such as The Times and Disney are doing with their customers.
The result of having more to sell can be seen to an extent in what I call The Metrics, the non-GAAP numbers that the Street obsesses over with software companies.
Specifically, the metric known as “net dollar retention rate,” the percentage of money existing customers spend each year relative to the prior year, is a reflection of the ability to sell more and more to the same customers. That figure increased last quarter by a point, year over year, to 111%, closing in on a long-term goal of 112% to 115% retention rate.
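For readers who want the mechanics, net dollar retention is simple arithmetic: this year’s spending by an existing cohort of customers divided by that same cohort’s spending a year earlier. Here is a minimal sketch, using made-up figures rather than Zuora’s actual cohort data:

```python
# Net dollar retention (NDR): revenue from the same customer cohort
# this year divided by that cohort's revenue a year ago.
# The figures below are invented for illustration; they are not Zuora's.

def net_dollar_retention(prior_year_arr: float, current_year_arr: float) -> float:
    """Return NDR as a percentage for one customer cohort."""
    return current_year_arr / prior_year_arr * 100

# A cohort that spent $10.0M a year ago and, after upsells net of churn
# and downgrades, spends $11.1M today has 111% net dollar retention.
print(round(net_dollar_retention(10.0, 11.1)))  # 111
```

Anything above 100% means upsells and cross-sells outweighed churn and downgrades across the existing customer base, before counting any new customers.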
As important as a rising retention rate is the number of customers of a certain size. The number of customers with an ACV, or “annual contract value,” of one hundred thousand dollars or more with Zuora declined last quarter by one from the prior quarter, but it was still up year over year.
Moreover, the number of customers spending half a million dollars or more with Zuora rose by twenty-five percent.
When I ask Tzuo which is more important, explore or exploit, he replies, “It’s a great question; to some extent, it’s a judgment call.”
Zuora is “not a VC [venture capitalist] deciding on who’s the right company, we welcome all customers,” but, at the same time, “we believe the subscription economy, ultimately, is a scale business.”
“We said eighteen months ago we would focus on the biggest-investing companies in the world that are committed to building billion-dollar subscription businesses,” he explains.
“It’s not as simple as who has the most subscribers wins, there can be multiple winners,” he says, including boutique businesses much smaller than Disney. “But if you have more subscribers then you have more revenue to invest in creating what the subscribers are hungry for.
“It certainly could be a startup like Zoom, it could be a large business like The New York Times saying we are committed to subscription businesses, we’re looking for the right technology, the right partner to help us get there.
“We like the idea of companies that are capable of building scale businesses,” which, generally, “requires them to be an incumbent large company, or a well-backed venture startup that has passed that product-market-fit milestone,” he says. Companies that know their customers, in other words.
“We are building our company to be the partner for those type of companies to be successful.”
Having large customers increasingly spending more “feels good,” he says. “A thousand customers each building a billion-dollar subscription business — call it, a billion dollars per year — through us.”
Churn, he notes, the rate at which Zuora’s customers go away, last quarter was the lowest it’s been since the company’s April, 2018 initial public offering. While Zuora doesn’t disclose the actual number, it’s an encouraging direction, if one assumes that finding more and more value in existing customers requires, above all, retaining customers.
The Metrics reported in the August 24th fiscal second-quarter report showed what you could say is a steady glide. Not only was the dollar-based net retention up a point, but, notes Tzuo, “our ARR [annualized recurring revenue] would have been up one percent [more]” if not for foreign exchange rates. ARR is an average of monthly sales extended out to a twelve-month period. It rose by twenty percent, equal with the previous quarter’s growth rate. Around a third of Zuora’s customers are overseas, so the rising dollar is expected to continue to reduce reported metrics such as ARR when converting from euros and other currencies.
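To illustrate the currency effect Tzuo describes, here is a toy sketch of how a stronger dollar shaves reported ARR growth even when local-currency growth is unchanged. All figures are hypothetical; only the mechanics mirror the dynamic above:

```python
# How a stronger dollar trims reported ARR growth for a company with a
# chunk of overseas revenue. All figures are hypothetical.

def reported_arr(usd_arr: float, eur_arr: float, eur_usd_rate: float) -> float:
    """Total ARR in dollars, converting the euro-denominated portion."""
    return usd_arr + eur_arr * eur_usd_rate

# A year ago: $200M of domestic ARR plus EUR 100M at $1.05 per euro.
last_year = reported_arr(200.0, 100.0, 1.05)   # 305.0
# Today: both books grew 21% in local currency, but the euro slid to $1.00.
this_year = reported_arr(242.0, 121.0, 1.00)   # 363.0

growth = (this_year / last_year - 1) * 100
print(round(growth, 1))  # 19.0 -- reported growth lags the 21% local-currency pace
```

The gap between reported and local-currency growth comes entirely from the exchange rate, which is why companies call out “constant currency” figures in quarters when the dollar moves sharply.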
And, subscription revenue, which is the vast majority of Zuora’s revenue, “continued to rise,” he notes, albeit more slowly, at seventeen percent growth versus twenty-one percent in the prior quarter. Without the foreign currency hit, it would have been up nineteen percent.
“Overall, things feel really, really good” is how Tzuo sums up the quarter.
This was the tenth quarter in a row the company beat revenue expectations, though Tzuo and CFO Todd McElhatton took down their view of the remaining two quarters’ revenue because of the dollar’s expected increase.
The response was a six percent drop in the stock the next day. The bulk of that reaction may have been due to some confusion about what McElhatton told the Street about collections. He said some customers slipped in their payments to Zuora.
“During the month of July, we observed some collection timing pushing out by a few days, on average,” said McElhatton, meaning, how promptly Zuora collects from customers.
I point out to Tzuo that many companies these days have been talking about “deal push-outs,” or “extra scrutiny” in software deals. Does a slip in some collections presage something serious about a worsening macroeconomic situation?
“We are not seeing collections being a huge issue,” he tells me.
The company, he says, “wanted to give the Street the best information.”
“What we’re seeing, basically, is, at the end of a quarter, if you are going to start withholding a few payments because you’re nervous, and you push it out by a few days at the end of the quarter so it runs into the next quarter,” then other parties respond in kind, explains Tzuo: “Shoot, well, maybe I need to do the same thing — you get this chain effect or herd mentality.”
That sounds ominous, but, says Tzuo, “If you look at the number of collections that are over sixty days, you don’t see any rise, you see a really small percentage, so our customer base is paying.
“We wanted to flag that it might be an impact at the end of our fiscal year, we might have a cash impact.”
Adds Tzuo, “You know, part of me feels like it was a little too much information; if people are trying to say, Gosh, is it some kind of early sign of recession, that’s not what we were saying, we’re not seeing that.”
Tzuo is a seasoned software veteran. He cut his teeth working first for Larry Ellison at Oracle, and then as head of marketing at cloud giant Salesforce. And so, I ask if there’s anything he’s seeing that looks to him like patterns from past economic downturns.
“I would say that the [software] industry right now is really worried about a slowdown in IT spend,” says Tzuo.
“I would say the picture is more that there isn’t any indicator of a slowdown, but there is more scrutiny of IT spend, and so you have to have something that’s valuable, and tied to the company’s strategic growth plans and we do believe we are” strategic, he says.
You’ll hear “more noise” from individual companies, observes Tzuo, meaning, this or that software vendor with issues, “but the overall trend is, you don’t see a pause in spending in what we do given the macro environment.”
Zuora stock is down fifty-nine percent this year at a recent $7.66.
In Barron’s Advisor: the right price for growth
Sep 16, 2022
One can mix various metrics of value and growth like tuning the sound of a stereo to get a more balanced portfolio.
In my latest missive this week for Barron’s Advisor, I’ve tried to provide a coherent approach to reconciling growth and value in tech stock picking. (Subscription required to read Barron’s Advisor articles.)
The crux of the piece is that one should focus on valuation multiples that incorporate projected growth for a company, which I refer to as “valuation-weighting” a portfolio. On a simpler level, you could say it’s just screening stocks to see which are both cheap and have above-average growth.
The subtler point I wished to convey, whether or not I succeeded, is that one can dial the measures of value and growth as if they are the controls of a stereo mixer to find the right balance of growth and value in a portfolio.
My approach, which I began before Tuesday’s really sharp sell-off, was to look first at which stocks out of hundreds had above-average expected revenue growth, and then ask which of those high-growth names cost the least for that growth.
By way of example, one of the stocks that emerged is one of the names that I picked for the TL20, Snowflake.
Of course, Snowflake stock is a lot cheaper now, but it’s still expensive by many measures. Its multiple of enterprise value divided by the next four quarters’ projected sales was 22.3 times based on last Friday’s close, a little less after Tuesday’s sell-off.
However, Snowflake also has among the highest rates of projected revenue growth, estimated at fifty-four percent over those forward four quarters, quite a bit higher than an average growth rate for hundreds of U.S.-listed tech names of just eleven percent.
When you divide Snowflake’s revenue multiple of 22.3 by its revenue growth rate of fifty-four percent, you get a “price-to-sales-to-growth” multiple of 0.41 times. Put another way, the premium you pay for every dollar of revenue over the next twelve months is less than half the rate at which those dollars are increasing from what they have been.
That is not a meaningless measure, though its value can be debated. You always pay some premium for the future revenue to be generated by a company, and if you can get that future revenue today at a premium below the rate of growth, it’s worth considering.
I call such a measure the relative price of growth, and my point in the article is to compare that valuation measure for Snowflake to other tech names such as Coupa, Atlassian, ServiceNow and The Trade Desk. Based on the price-to-sales-to-growth multiple, you pay more for all of those stocks than you do for Snowflake, even if their multiple of enterprise value to sales is lower than Snowflake’s, because none of them has the projected growth of Snowflake, not even close.
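The screen described above — keep only above-average growers, then rank them by the price paid per point of growth — can be sketched in a few lines. Snowflake’s figures are the ones from the article; the other tickers and numbers are invented for illustration:

```python
# A minimal sketch of the "relative price of growth" screen: filter for
# above-average projected revenue growth, then rank by EV/forward-sales
# divided by the growth rate. Tickers other than SNOW are invented.

stocks = [
    # (ticker, EV / forward sales multiple, projected revenue growth %)
    ("SNOW", 22.3, 54.0),
    ("AAA",  12.0, 15.0),
    ("BBB",   6.0,  8.0),
    ("CCC",   9.0, 30.0),
]

avg_growth = sum(g for _, _, g in stocks) / len(stocks)

# Keep only above-average growers, then rank by the price paid per
# point of growth: the revenue multiple divided by the growth rate.
screen = sorted(
    ((t, round(m / g, 2)) for t, m, g in stocks if g > avg_growth),
    key=lambda row: row[1],
)
print(screen)  # [('CCC', 0.3), ('SNOW', 0.41)] -- cheapest growth first
```

A multiple below 1.0 means the premium paid for each dollar of projected revenue is less than the rate at which those revenue dollars are growing, which is the condition the article treats as worth considering.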
What results from that exercise is that a universe of over five hundred U.S.-listed shares is boiled down to twenty-eight stocks with a nice combination of growth and valuation in their favor. To return to the analogy with a stereo mixer, you can throw in the multiple of earnings, for example, divided by earnings growth rate — the “PEG” — or the dividend yield, in order to dial up greater emphasis on certain aspects of profit and income, etc.
Based on the mix of factors, Snowflake might come out less desirable. What is important is thinking about the portfolio as balancing certain desirable aspects.
An interesting question is how this search differs from the TL20. The most important element, in my mind, and it separates stock-picking from stock screening, is that every single stock in the TL20 was chosen first and foremost because it is a stock of a company with a track record of achievement, and with what I regard as tremendous potential based on my own decades of looking at companies.
The names in the Barron’s Advisor screen emerge from what you might call a purely quantitative process, letting the “data speak,” to use an over-used phrase.
That’s not meant to be a judgment, but the two approaches are different. The TL20 involves a bit of pattern-matching that’s a little hard to capture in a screen, while the exercise in valuation weighting is more mechanical.
Like an aircraft carrier: Procore’s bet with investors on a $14 trillion opportunity
Sep 15, 2022
Everyone knows that the investment climate this year has gotten a bit chilly for anyone not already profitable in technology. Or, at least, that’s the common conception these days.
One of the better performers in software is not profitable at the moment, but still maintains a fairly generous stock multiple of ten times next year’s projected sales, a multiple that the conventional wisdom would say should only be accorded to stocks of companies that are profitable.
“The herd mentality of Wall Street has definitely shifted to free cash flow at all costs,” says Craig “Tooey” Courtemanche, CEO of Procore Technologies, which sells software to streamline the challenges of the construction industry.
Procore, which came public in May of last year, has yet to report a profit, though it reported a small amount of free cash flow in 2020. Analysts expect Procore to lose money through 2024.
What is exciting is revenue growth, currently estimated by the Street at thirty-five percent this year, twenty-three percent next year, and twenty-two percent in 2024, at which point annual revenue will cross over a billion dollars from seven hundred million this year.
As they say, you have to spend money to make money.
“This is the bet we have with the investment community: the opportunity in a $14 trillion industry is so great we should be investing — with an eye to profitability,” says Courtemanche, in a conversation we had on Zoom following the company’s second-quarter report last month.
The fourteen trillion-dollar opportunity to which Courtemanche refers is the global construction industry, including not just McMansions but, more important, warehouses, distribution centers, data centers, bridges, hospitals — things that, Courtemanche likes to point out, tend to continue to get built even during recessions.
“I had a media person yesterday ask me about what I think of this niche industry we’re serving,” he recalls. “And I reminded them that it’s a ten to fourteen trillion-dollar industry — I don’t know why you would call it a niche!”
The company’s five-hour quarterly business review meeting, right before our chat, was upbeat, Courtemanche tells me, which is interesting at a time when lots of the software world is warning of slowing deal closings, and when companies such as Twilio are starting to lay off personnel.
Last month, with its earnings report, Procore raised its full-year revenue outlook for the second time this year. “We just feel very confident,” he says, “we actually saw a lot of expansion across our entire business.”
“There are lots of new contractors coming on board, our international business is growing, and existing customers are spending more with us as they increase their volume of business.”
“Customers are very optimistic.”
One thing that’s got them so optimistic is the Inflation Reduction Act, signed into law last month. “I was talking to one customer, and I asked how they would benefit,” he relates. “He said we just got ten years of guaranteed backlog for our solar and wind division — he was like, Dude, I’ve never in my career had ten years of committed business!”
Until it gets revised, the cynic will say.
“Yes, well, let’s see how this all ends,” says Courtemanche, “but if that doesn’t work, he could go build a chip fab,” an allusion to Intel’s planned twenty billion dollars of spending on new chip factories in coming years. The customer he was talking to, Courtemanche says, told him that he had, in fact, been asked to contract on a billion-dollar chip fab in Portland, Oregon, and he said no, “because he didn’t have the people to staff it.”
“We have gone from a year of abundance — people were paid not to go to work — to a new era of scarcity,” is how Courtemanche sums it up. Industry is running hot, labor is short, and the Street still doesn’t quite get it.
When we first talked, in May, Courtemanche told me the Street doesn’t get it about just how resilient construction is. Things like hospitals, and even multi-billion-dollar chip fabs, continue to get built even through recessionary cycles.
Has he made traction convincing investors? I ask.
“It certainly seems like it, the progression of questions and concerns, they seem to be stabilized more than ever,” he says of investors. It is true, he says, that “the difference in perspective of what a Wall Street investor sees the world as, and what our customers see the world as, there’s just a stark difference.” And yet, “It seems the bargain between the two,” growth and profit, “is very well understood” by investors now, Courtemanche tells me.
“We have to continue to show progress to cash flow breakeven, that’s the commitment,” he adds. And Procore has, he says, shown such a commitment by steadily improving its profit margin.
The non-GAAP operating loss this year, projected at thirteen to fourteen percent of revenue, is two percentage points better than the company expected at the start of the year. “We’re being very intentional about every dollar that we spend,” he notes.
The gross profit margin is of the typically high software variety, at eighty-three percent.
Because his customers have mammoth demand, Courtemanche has been making plans for the next major initiative to make money aiding subcontractors, a “fintech” initiative, if you will.
Procore is gathering data and running models to ramp up a financing and insurance service for subcontractors to relieve them of a chunk of up-front expense, and provide competitive deal insurance.
“They are little, tiny businesses inside of Procore that we’re incubating,” he says of the initiatives, at this stage by investing small amounts of capital.
Subcontractors are “really cash-strapped,” notes Courtemanche. “Financing things off your balance sheet is a really bad business model,” especially for smaller subcontractors.
Procore has “some milestones to hit internally,” he says, to figure out what is the right pricing, what is the right product for financing, and, of particular importance, when is the right moment to approach contractors when they are “at that right time of need” to be open to a financing proposal.
Ultimately, the company intends to find a capital provider to do the financing, he says. “We’re not going to do this off our balance sheet at that point,” he says.
“When you see us announce that we’ve formed that relationship, you’ll know we’re moving into a real thing.” The key, he says, is for Procore to be the brains providing the risk model to the financier.
Investors still obsess over recession, says Courtemanche, but his construction industry customers “are very optimistic.”
“If we were a bank or insurance company, what you do is create risk profiles and test your loss ratios to the amount of money you put out,” he says. The perfection of those loss ratios is one of the main things being assembled now, based on the years of data gathered by the Procore software. Procore has unique scope on its industry in that respect.
“We have thirteen thousand customers, so we have a lot of data to test against,” he says, “which gives us a competitive advantage over anybody else because they don’t have that.”
Some of the risk with small subcontractors is not what you might think. If it’s an Amazon AWS data center being built, for example, “we would say Amazon is the underwriter for that, not the subcontractor.” And yet, the subcontractor can still get left holding the bag, and still could use some financial support, he argues.
How big is the market? The U.S. construction industry carries some four hundred billion dollars a year of materials on its collective balance sheet, he notes. “The folks that we are talking to that would be a capital provider are very large, well-known institutions,” he says. They will be examining Procore’s track record too, he says. “I think they feel confident we’ll be able to figure this out,” given the experience and the data baked into Procore’s software.
Despite the constructive tone of everything in Procore’s customer and industry outlook, is there a Plan B, I ask, should things go really south with the economy?
“We always have an internal Plan B,” he says, although, “There’s no long-term Plan B like if all hell breaks loose,” he adds. “But I will say that with the external factors that we’re seeing like the macro [macroeconomic view], even though our customers are saying they’re cautiously optimistic about what’s happening, we take that into account.”
Rather than a sharp change, there is room, if necessary, for “adjustments to the business model,” such as delaying new hiring, he explains.
“That’s the beauty of this market,” says Courtemanche. “This market is like an aircraft carrier, it’s not like a speedboat: it goes at a particular pace, takes forever to turn.” In fact, five miles to turn an actual aircraft carrier, Courtemanche tells me.
“And, so, we have this early warning radar of what we need to do,” he says, “you get a good idea of where the market is going.”
That radar means Procore can “make these long-term investments knowing the digital transformation in construction is going to continue if not accelerate,” he says.
“We make minor adjustments around the edges, and go with the flow.”
Procore’s stock, at a recent $57.53, has been one of the better performers, down thirty-four percent this year, but up thirty-two percent in the past ninety days, versus the Nasdaq Composite’s decline of twenty-five percent and gain of eight percent in those respective periods.
Procore will hold its first analyst day meeting, the day-long series of management presentations to sell-side analysts, on November 9th.
In the TL20, as in the broader market, people sold their losing names on Tuesday
Sep 14, 2022
The broad selloff Tuesday certainly did not spare the twenty stocks in the Technology Letter 20. And the most interesting thing is that the declines were spread fairly evenly amongst the group. The selling did not distinguish between more expensive and less expensive names.
The TL20 declined by 5.9% on Tuesday, worse than the 5.16% decline of the Nasdaq Composite Index, although not as bad as some really pricey vehicles such as the ARK Innovation ETF (ticker “ARKK”), which collapsed by almost seven percent.
As you can see in the table below, the worst declines, led by Nvidia’s near ten-percent drop, were spread across a spectrum of valuation, based on enterprise value as a multiple of projected sales. Nvidia is still one of the most expensive chip stocks around, but Block, which is trading at a serious discount of less than two times projected sales, was also one of the biggest losers. In contrast, Tesla, one of the most expensive stocks in the group, at just under nine times projected sales, held up better than the Nasdaq, dropping only four percent.
Note: Coherent originally appeared as II-VI; the company changed its name on August 8th with the consummation of its acquisition of laser maker Coherent.
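Enterprise value as a multiple of projected sales, the valuation yardstick used in the table, is simple to compute. A minimal sketch in Python; the input figures here are hypothetical, purely for illustration, not taken from any company in the TL20:

```python
def ev_to_sales(market_cap, total_debt, cash, projected_sales):
    """Enterprise value as a multiple of projected (forward) sales.

    Enterprise value = market capitalization + total debt - cash.
    All inputs should be in the same currency unit.
    """
    enterprise_value = market_cap + total_debt - cash
    return enterprise_value / projected_sales

# Hypothetical figures, in billions of dollars, for illustration only:
# $40B market cap, $5B debt, $3B cash, $21B in projected sales.
multiple = ev_to_sales(market_cap=40.0, total_debt=5.0, cash=3.0,
                       projected_sales=21.0)
print(round(multiple, 1))  # 2.0, i.e. "two times projected sales"
```

A lower multiple, as with Block in the table, is the sense in which a stock is called “cheap” here.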
And, curiously, the decline Tuesday did not punish the most expensive stock in the group, a true outlier, Snowflake, which is now even more expensive than the already expensive multiple it had back in July, even though estimates for Snowflake have risen since then.
In fact, the most obvious pattern Tuesday is that people were selling their losing stocks more than they were selling their winners. I’ve sorted the table by the year-to-date decline. The top ten worst decliners for the year, including Block and Nvidia, down forty-eight percent, on average, had an average decline on Tuesday of 7.2%.
That is much worse than the 4.6% decline of the ten names in the TL20 that have held up the best this year, including ASML and Check Point Software.
The ranking is the same if you roll back to Monday’s close. The stocks that were down the most for the year through Monday’s close are the ones that declined the most on Tuesday, with the exception of Snowflake, which actually held up better than you would have thought.
The fact that people were selling their biggest losers Tuesday suggests we are not yet done with massive sell-offs. The notion of “capitulation” holds that the market has only really hit bottom when people sell their winners. They haven’t done that yet, by and large.
As for the TL20, I stand by the value argument. Many of these stocks were trading at multi-year lows in July when I surveyed them. Several are cheaper now, as indicated by the red boxes. Eventually, the value in those names will win out.
Oracle’s growth picks up: The Larry and Safra show wows the Street
Sep 13, 2022
The earnings season began again this evening, with Oracle being the first company to turn in results thanks to the fact that it closes the books really quickly. The company finished the quarter August 31st and was ready to report less than two weeks later.
In fact, Oracle’s CEO, and de-facto CFO, Safra Catz, told the Street that the ability to close the books so quickly is proof of how good the company’s software is.
“Now, today's the 12th of September,” Catz observed, during the company’s conference call with analysts Monday night.
“In fact, I signed off with our auditors on Friday,” she said, “but we don't do our earnings on Friday, so we had to wait all the way till Monday.
“Now, no other companies report on the ninth or the eighth day, in fact, most companies were reporting their July quarter last week, and here we are announcing an August quarter.”
What does that have to do with anything? Catz told analysts Oracle’s software is the same stuff that Oracle uses to run its financials. If Oracle can turn the books faster, it must be great stuff, she said.
That kind of puffery, from a CFO, is fairly routine these days in tech-land. But even by today’s standards, Monday’s conference call was an enormously feel-good affair. Catz, and Larry Ellison, co-founder, chairman and CTO, took turns trying to outdo one another with descriptions of how great the business is doing.
The reported results slightly beat expectations, and the forecast for this quarter missed for the first time since December. But that mattered little. Oracle’s results are held back by the strong U.S. dollar, which has risen over twelve percent this year against the Euro.
Foreign exchange is something Oracle can’t control, so the Street tends to put that off to the side.
More important, Oracle is, indeed, showing some increased growth rates, and that’s news.
The quarter’s highlight was the company’s “organic” revenue, meaning revenue excluding the contribution of healthcare giant Cerner, whose acquisition Oracle closed during the quarter. Organic revenue rose eight percent, year over year, the highest rate of quarterly revenue growth in years.
Catz cited a bunch of figures for the company’s cloud computing offerings. Because they are tossed out there in somewhat haphazard fashion, it’s hard to make sense of it all. But the gist of it is that there is some momentum in Oracle’s cloud computing services, which compete with Amazon, Microsoft and Google.
As Catz summed it all up, “It's not only that our growth rates are higher than our hyperscale competitors — maybe you’d expect that because we're the newest, and thus, the smallest — but our growth rates are increasing as we get bigger.”
Fair enough. The hyperbole, however, from Ellison, while triumphal as usual, contained an odd note of contrition. Ellison usually boasts about how much better his technology is than that of his competitors.
He did a little of that Monday evening, but then, he pivoted. He talked about how Oracle software can now be used from within Amazon and Microsoft cloud computing services.
“Multi-cloud interoperability is an important step in the evolution of cloud computing,” said Ellison. It’s slightly odd for Ellison to talk about people wanting to use other companies’ products or services.
The contrition didn’t last long. Ellison noted some new customers for Oracle in cloud computing, such as Nvidia. He also remarked on how he has been “personally talking to some of Amazon's most famous brands that are running at AWS.”
“The amount of money these huge companies, these very famous companies, spend with Amazon is kind of staggering,” he observed. Said Ellison, “they can save a huge amount of money by moving to OCI,” Oracle’s competing cloud service.
Ellison teased that “next quarter, we'll be announcing some brands, some companies moving off of Amazon to OCI that will shock you. I'll stop there.” That is vintage Ellison puffery.
The analysts on the call sounded fairly impressed. The shares rose mildly in late trading.
Among the intriguing details of the evening, other than the revenue increase, is that Oracle is spending like crazy to build its cloud computing facilities. The company’s capital expenditures in the quarter rose by sixty-two percent, year over year, to $1.7 billion. That was quite a bit faster than the forty-four percent rate by which capital spending had risen in the same quarter a year earlier.
Ellison explained that Oracle has “more data centers in more countries, in more cities than Amazon or AWS,” adding, “We’re expanding, because the demand is there.”
If there was one thing that wasn’t exactly clear this evening, it was the trajectory of Cerner under Oracle’s control. As I noted back in May, the twenty-nine-billion-dollar acquisition of the health care software firm has yet to prove itself a win. There was a little bit of discussion on the call, but not much, even though the deal was finally consummated last quarter.
Catz noted, encouragingly, that Oracle’s gross profit dollars rose by fifteen percent with the contribution of Cerner in the quarter, whereas the profit would have only increased by seven percent without Cerner. “In fact, the gross profit margin increased dramatically in the quarter,” she noted, although Oracle doesn’t report its actual gross profit margin.
One quarter does not make a trend, but this was a strong quarter for Oracle. My question, going forward, is what to make of a now heavily encumbered balance sheet.
This is the first quarter in which Oracle’s working capital, the difference between current assets and current liabilities, turned negative. Oracle, for the first time since I can remember, has less in available money to cover expenses than the total amount it owes in near-term obligations.
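Working capital, as used here, is a one-line calculation. A minimal sketch with hypothetical balance-sheet figures, not Oracle’s actual numbers:

```python
def working_capital(current_assets, current_liabilities):
    """Working capital: current assets minus current liabilities.

    A negative result means near-term obligations exceed the assets
    readily available to cover them.
    """
    return current_assets - current_liabilities

# Hypothetical figures, in billions of dollars, for illustration only.
wc = working_capital(current_assets=30.0, current_liabilities=33.0)
print(wc)  # -3.0, i.e. negative working capital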
The long-term debt balance, moreover, has swelled to seventy-five billion dollars. And the company carries an astonishing sixty-one billion dollars in goodwill on its balance sheet.
Oracle is growing as it has not in several years, but it is not a lean, mean machine; it’s a big pile of IOUs.
The only reason that fact may matter to investors in the immediate term is that, as I mentioned back in March, the company’s stock buybacks have slowed dramatically. Buybacks totaled just over half a billion dollars last quarter, versus eight billion dollars in the same quarter a year earlier, continuing a slowdown that has run for several quarters.
With the stock one of the better performers this year, down only twelve percent, and the dividend yield one of the better ones available, 1.7%, and with growth now at a faster clip, perhaps none of that will matter.
GitLab CFO: We sell essential things for no more than the price of a Netflix subscription
Sep 08, 2022
Eight-year-old software maker GitLab is the best-prepared company to handle the quarterly earnings call with Street analysts, according to none other than the company’s chief financial officer, Brian Robins.
“We do the most work before we go into these calls,” says Robins. “I listen to all the transcripts of all the other CEOs and CFOs, I read all the analyst reports, I talk to public company CFOs, I talk to board members on public company boards.”
I was talking Wednesday with Robins via Zoom following a highly successful conference call the night before for GitLab’s fiscal second-quarter earnings report. The report, and the call with the Street, sent GitLab shares up fourteen percent Wednesday, capping a forty percent rise in the past six months.
Because Robins reads everything and talks to his peers, he knew going into last night’s affair that what was on everyone’s mind was how the macroeconomic outlook is causing a lot of companies to revise down their forecasts.
“These are uncharted markets, if you will,” Robins observes of the mood afoot. Companies of varying sizes are talking about how they are seeing deals to sell software come under greater scrutiny. The common refrain is, “We saw sales cycles lengthen.”
Not so GitLab. During the quarter, Robins and CEO Sytse “Sid” Sijbrandij told the Street Tuesday, “Buying cycles have actually sped up across the business.” GitLab’s sales rose a remarkable seventy-four percent last quarter, year over year, to $101 million, the fourth quarter in a row since the company’s IPO in October last year that results have easily topped expectations. The company added more new customers than in any quarter so far.
Despite beating the Street consensus for revenue by seven million dollars, Robins told analysts the outlook for this year is now about twelve million dollars higher than previously expected, the second time the company has raised its full-year outlook, cementing the sense that the company is defying gravity.
And so, I asked Robins, as had the Street, What is going right for the company?
“I think being a mission-critical platform,” says Robins.
GitLab is a set of software capabilities, woven together, that serve application developers. On the simplest level, it is what is called a “version control system,” a repository where coders put their pieces of code, a system that maintains the record of which is the latest version, which are stable pieces of code, which are experimental, and keeps track of who checked code in or out, the way a library keeps track of borrowers. It’s the kind of thing that is an essential tool for programmers.
In addition, the software has taken on other capabilities, such as measuring how programs perform once they go live, and automating the process of finding bugs in masses of software too convoluted to be checked by hand.
The product has the homespun appeal of having begun life as a grass-roots project. The first version of the software was created when co-founder Dmitriy Zaporozhets, “created GitLab from his house in Ukraine,” as the IPO prospectus recalls, “a house without running water.” Later, a member of the developer community, Kamil Trzciński, convinced Sijbrandij and Zaporozhets to expand the software, a pivotal, game-changing move.
Today, GitLab’s suite of tools is a “platform,” in software industry argot, an array of functions under the “DevOps” banner that not only manages code but checks for security breaches, compliance with rules, and other things that fall under a company’s software operations.
Competitors include DevOps vendors such as Datadog, but most of the competition, insists Robins, comes from companies that try, and fail, to build something in-house to manage their programming tasks.
As a result, he tells me, “Companies are saying, I can save money, I can be more productive, I can hire fewer engineers if I deploy GitLab,” and that makes the program indispensable.
Robins is fond of citing “cohorts,” the customers who have come to GitLab in waves and who end up staying and spending more. The company’s dollar-based net retention, or “DBNR,” one of The Metrics that are paramount in the software world, has consistently been over one hundred and thirty percent. That is a measure of how much more customers buy than they did a year ago. That is a very healthy rate of expansion.
“Our cohorts are still expanding with us — six, five, three, two years ago — they are all still expanding,” he says. “That doesn’t happen unless you’re fairly priced, you offer a good business outcome, and your time to value is quick.”
I offer to Robins to play devil’s advocate. Is it possible, I ask, that a company at half a billion dollars in annual revenue, selling a useful product at a modest price — the entry level for the software is $19 per developer per month — is simply not a big enough line item to attract the attention of the CFO when she or he, in times of trial, is looking around for extra cost savings?
“What I love about the model, from a CFO perspective, I have such great predictability into our revenue,” says Robins, referring to the subscription nature of Gitlab software, “if I see any degradation in the pipeline, I can adjust,” he says, meaning, the deal pipeline.
“I love it, because I think the same way, it’s the right way to think,” says Robins. He recalls talking with a GitLab investor Wednesday morning. “The investor said, the Premium [version of GitLab] is the price of a Netflix subscription — you offer so much more to a developer for $19 a month.”
There are only two tiers of product. The “Ultimate” version rises in price to $99 per developer per month. “For less than twelve hundred a year, you’re getting the most advanced and comprehensive security features as well,” he notes. Companies spend as much as fifteen thousand dollars a year to equip a junior sales rep with tools for sales and marketing, so in the scheme of things, GitLab seems a rounding error.
The product has a “a super-simple” pricing model, he notes. It is the same cost whether companies run GitLab in their data center, or in a public cloud facility such as Amazon AWS. The company has “removed all the friction” in buying because it knows companies will over time add more and more subscriptions for their developers as the product spreads inside teams, building that DBNR.
Add in the fact that the software makes developers more productive. T-Mobile, a GitLab customer, went from a several-month release cycle for new applications to several weeks, Robins tells me. Goldman Sachs went from weeks to hours. Market research firms such as Forrester Research estimate the GitLab software has a return on investment of over four hundred percent, on average.
But, GitLab is still very early in its mission. As Sijbrandij is fond of telling the Street, GitLab has yet to crack the vast majority of developers in its target customers.
So, I ask, when does GitLab rise to a level of a customer’s spend where it does attract the increased scrutiny of the CFO?
“It’s a great question,” says Robins. Customers have signed huge checks for GitLab at times, he observes, ten million dollars or more at a throw, even though those aren’t the norm. The biggest single deal was investment bank UBS signing a license for nine thousand users.
It comes down to the return on investment, says Robins.
“If someone came to me as the CFO of GitLab, and said, I need to invest $600,000, and I’m going to save you three million dollars, are you in for that,” reflects Robins, “I would say, If it’s the right thing for the business, yeah, I’ll approve that all day long.” Citibank, he notes, spends an estimated one billion dollars per year on security.
“If we’re at a ten, twelve-million-dollar clip with a bank, I don’t think they’re going to get far enough down the descending rank of expense of vendors at the CFO level to hit me.”
Because he observes his peers, and he listens to what the Street is saying, another thing that Robins has observed this year is that the market has changed from one that lauded growth at all costs to one that “demands you get profitable at all costs,” he quips.
GitLab is not yet profitable. He and Sijbrandij have told the Street that growth is the most important thing, for which they will continue to invest, but they will do so responsibly. The company hired over four hundred new employees in the first two quarters of this year.
Nevertheless, the company’s gross profit margin last quarter rose by a point to eighty-nine percent. And the company’s non-GAAP operating profit margin improved by fifteen points, to negative twenty-seven percent. Those numbers are testimony to increasing efficiency, what is referred to in the software world as improving “unit economics” of selling software.
“We will add $160 million of incremental revenue this year, at the mid-point of guidance, for less of a loss this year than last year,” he notes, regarding the revenue forecast. That is despite a return of some expenses that had been in abeyance such as travel.
The heavy lifting is mostly done, he insists. “We got the infrastructure built, and now almost all of what we’re adding is in sales for incremental capacity, in marketing for lead-gen, and in R&D for product feature functionality enhancements.”
Those new team members will take nine months to contribute to new sales as they come up to speed. If things go well economically in that time, the new recruits could be a big lift to capture sustained customer demand.
And if things don’t go so well economically?
Robins returns to the predictability of the subscription model. “If I see any degradation in the pipeline, I can adjust,” he says, meaning, the deal pipeline.
To adjust, in this case, could mean putting a sudden hold on hiring. That “visibility” into things, and the ability to temper hiring, are “two levers” to adjust, as he sees it. “As the CFO of this company, I feel I’m in a very privileged spot, a unique situation, so I can manage the profitability of the business, if something were to go negative.”
Robins, 51, is a very seasoned executive and a very seasoned CFO. He has had tenures as a consultant to hedge funds and venture capital funds, as well as multiple CFO stints with large companies, including digital authentication vendor Verisign, and consulting giant Computer Sciences Corp. Over the years he has seen the workings of companies much larger than GitLab; by comparison, GitLab is more like turning a speedboat than a tanker.
“If I were a manufacturer, I would have to put all the PP&E [property, plant and equipment] in place, and then hope I can get the manufacturing ramped up, and then sell the crap out of it.”
Given the improving unit economics, given the levers of control over growth, why not, I ask, give the Street some sense of just when GitLab intends to be profitable?
“Great question,” says Robins. “We’re working internally, we are doing what we said we will do, it’s in our long-term model, we are executing against that better than what we said at the IPO” in October of last year.
The company may have an “analyst day” meeting at some point, the confab where a vision is laid out, but that hasn’t been decided yet. “It’s when you want to tell something, you can’t just regurgitate what you said on earnings, you don’t have one just to have one.”
In the meantime, Robins spends shoe leather to canvass the investor community. “I talk to well over a hundred investors a quarter, I talk to every analyst who covers us, I get out on the road.”
The stock, at Wednesday’s close of $54.16, trades for about eleven and a half times the Street’s estimate for next year’s revenue, which prompts me to ask one of my favorite questions for CFOs, Is the stock a good buy?
“I only control what I can control, and that’s the execution of the business,” says Robins. “I’m super-happy that we reported seventy-four percent year-over-year growth, and we added the most base customers in company history in really, really, really tough macroeconomic conditions.”
GitLab stock is down thirty-seven percent this year, and fifty-seven percent since the IPO.
TL20 slips below Nasdaq for the first time
Sep 02, 2022
The carefree days of July are a distant memory, with the TL20 group of stocks now up a mere one and a half percent since the inception date of July 15th. As you can see in the chart, Thursday was the first time the TL20 slipped below the return of the Nasdaq Composite and the S&P 500 since inception.
The proximate cause is the semiconductor companies. They are the worst performers aside from computer security vendor Check Point.
A further examination of the semiconductor names shows that most had solid earnings reports during the past month and a half, but their forecasts missed in many cases.
The signature example of what’s going on in the chip industry is memory chips. The warning by Micron Technology, a maker of NAND and DRAM memory chips, on August 9th, set off the weakness in the group. Micron said that it sees its customers revising their orders as they consider how to sell off inventory of chips they already have. That led to the conclusion that most chip companies are dealing with a sudden build-up of inventory.
This is the semiconductor cycle playing out, the bust that follows the pandemic-induced boom. All eyes are now focused on whether it will be a V-shaped recovery or something more protracted.
Robert Maire, longtime semiconductor analyst, writes in his latest missive via email that “the bell ringer indicating the bottom of the cycle is the last bullish analyst capitulating (ignoring those who never change their ratings….).”
It certainly seems as if the last bullish analysts have capitulated. Revenue estimates for Micron’s fiscal year ending August of 2023 have been cut by twelve billion dollars this year. Cuts of multiple billions of dollars in estimates have also been made for many other semiconductor companies and tool makers, including Applied Materials and Lam Research.
Probably, the transition from the extraordinary tightness of the lockdown period of COVID-19 to whatever is the new normal for the chip industry will be unlike any other semiconductor cycle. Regardless of the shape of that transition, the key thing that I observe is the same thing I observed in mid-July when compiling the TL20. The chip stocks are trading cheaper than they have been at earlier times. Some of them, such as Applied Materials, are cheaper now than they were in July.
For key companies such as Applied and ASML and others, their valuations, as value stocks, will win out in the end.
Pure Storage, Nutanix shine in a night of wrecks
Sep 01, 2022
Yeeesh. It’s a brutal night for earnings reports, with multiple double-digit decliners, including some high-flying names that have been software darlings.
The good news tonight is that Pure Storage, the maker of a kind of NAND flash-based device for managing data, beat with its results and outlook and is up by six percent in late trading. Pure is one of the TL20 stocks to consider, so I was glad to see that.
Another winner is Nutanix, whose shares are up nineteen percent this evening, a big rebound from the disastrous sell-off in May.
But first, let’s look at the bad news. C3.ai, Okta, MongoDB, and Veeva are all down by double digits.
Tom Siebel, founder and CEO of industrial AI company C3, said in his press release tonight, “the economic downturn is real.” He told analysts on the call, “Our customers and prospects appear to be expecting a recession,” and they are more and more scrutinizing deals for his software.
“In the course of the quarter, we saw sixty-six forecasted deals move out in the quarter, many of which we would have fully expected to close under normal market conditions,” said Siebel. He said the company is responding by making cuts to non-essential spending.
This is a continuation of Siebel’s tone back in June, when he told the Street things were becoming “quite dire.”
The side story here is that Siebel announced the company is moving from selling contracts with predictable revenue to selling on a “consumption” model. In a consumption structure, C3 doesn’t bill customers at a pre-ordained time, like the beginning of each quarter, but only as they use the software.
That is the approach that companies such as Snowflake and Confluent use, and Siebel said consumption pricing is becoming the standard in cloud software.
Said Siebel, it was necessary to move away from the company’s more traditional selling approach.
“While this elephant hunting subscription sales model has served us well in establishing C3 AI as a leader in enterprise AI, it is clear it is not well suited to the deliberate decision and approval processes inherent in the current economic environment.”
It’s an approach, however, that injects some uncertainty into C3’s model, because the rate of revenue recognition now falls into the hands of customers.
And consumption pricing won’t necessarily prevent economic effects. On tonight’s call, MongoDB CEO Dev Ittycheria told the Street that his company is seeing adverse effects despite still-healthy demand for its database product.
Although “MongoDB is a non-discretionary spend for our customers,” Ittycheria said, “As expected, we did see the macro environment weigh on the growth of Atlas consumption.” Atlas is the name of the version of the company’s database that can be used by customers in public cloud computing facilities.
“It's important to understand that the slower-than-historical consumption growth is the result of slower usage growth of our customers' underlying applications due to macro conditions,” added Ittycheria. “In the current environment, some businesses, and, consequently, their applications, are growing more slowly.”
Over at Okta this evening, the reported revenue was better than expected, but the forecast for this quarter’s revenue was merely in line with consensus.
On the call, CEO Todd McKinnon told analysts that the company has seen “a discernible impact from the evolving macro environment.” Specifically, “We are starting to notice some tightening of IT budgets and lengthening sales cycles relative to last quarter.”
“This leads us to believe that the weakening economy is having some impact on our business,” said McKinnon.
Another company having a tough evening is Veeva, which makes cloud software for Big Pharma. The company missed with its revenue outlook for this quarter, and CFO Brent Bowman told analysts on the call that economic factors are affecting smaller customers and the drug advertising market.
“Specifically, it's impacting commercial a bit more,” said Bowman. “We've seen some impact to Crossix as advertising budgets have tightened a bit, and we also saw a little bit of lower add-on to users from SMB customers in the CRM and bulk commercial.” Crossix is the name for Veeva’s program that helps drug companies analyze their media spend.
The bright spots, again, were Pure and Nutanix. Pure’s results and revenue outlook both topped expectations, as they have for many quarters. In his prepared remarks, CEO Charlie Giancarlo spent most of his talk telling analysts how great demand is from customers. His only reference to trouble was the remark, “We do, however, see signs of increased diligence of purchases by enterprise customers, resulting in some lengthening of sales cycle.”
When pressed on the matter, Giancarlo added that the company is not losing any deals, but the CFOs at customers are weighing in more than usual:
While we are seeing you know a little bit of let's say second, second looks by companies, you know finance perhaps stepping in for a second look at a deal and that is lengthening some of the enterprise sales cycles, but it's not changed the closing of the deals later in the process. And as I said demand and pipeline looks very healthy.
In a note to clients this evening, Cowen & Co. analyst Krish Sankar writes of Pure, “A recession-resistant stock...so far!”
Finally, Nutanix, you’ll recall, had a terrible May report, falling twenty-three percent after it offered a disappointing quarterly forecast and cut its full-year outlook.
Tonight, the company’s outlooks for both the fiscal first quarter and the full fiscal year are meaningfully higher than Street consensus.
CEO Rajiv Ramaswami told analysts he has been traveling a lot to meet with customers, and that he continues to see “solid demand” out there.
However, he also said the company is laying off four percent of its workforce, amounting to two hundred and seventy people. The move was presented as a way to be “diligent” about expenses in order to ensure profit going forward. Nutanix turned free-cash-flow positive in the fiscal year just ended, and intends to remain so going forward.
When Ramaswami was pressed on the issue of the economy, he remarked that a good part of future revenue is in the bag, so to speak. That’s the portion that consists of “renewals,” contracts under which existing customers keep using Nutanix software. Customers tend to want to keep what they’ve already been using, so they renew.
However, he also noted, “it's in the new and expansion business that we have factored in some conservatism as it relates to the macro environment.” Meaning, Nutanix acknowledges it may be harder going forward to win new business, rather than renewals.
Despite the two bright spots, there’s something of a pattern here. It sounds like no one has a full grasp of what is going on with IT buyers. The happy possibility is that Nutanix and Pure may be selling products that are important enough that they are less likely to be cut off.
The question is whether things such as “increased deal scrutiny” at some point turn into customers heading for the exits.
HP Enterprise: holding pattern continues
Aug 31, 2022
The two Hewletts — Hewlett Packard Enterprise and HP Inc., the concoction of Meg Whitman when she split the business in two seven years ago — came up short on Tuesday evening, reporting revenue below expectations and, in the case of HP, a lackluster forecast.
The one Hewlett is doing quite a bit better than the other, however. Enterprise, the part that sells networking and servers and builds supercomputers, is on track to come pretty close to goals set out a year ago. It has record backlog of orders to fill, which speaks to the health of its market.
The sour result at HP Inc., on the other hand, echoes the gloomy report from competitor Dell last week with its miss on quarterly results and miss on forecast. The PC market is going through its long unraveling, which is having a major negative effect on HP’s revenue. The company doesn’t forecast revenue, but its profit per share forecast for the current quarter, seventy-nine cents to eighty-nine cents, is more than twenty percent below the consensus for a dollar and six cents. That’s in large part because revenue won’t be as high as originally expected given the weak PC market.
Hewlett Packard Enterprise shares fell three percent in late trading, while HP Inc. shares fell six percent.
Both Hewletts are struggling with not being able to get enough parts to make some shipments, a continuation of the supply chain mess. The silver lining for HP Enterprise is that those orders for networking and servers keep coming in, so its appeal to customers is intact, unlike with HP Inc. in the PC market, where demand is falling apart.
With one quarter left to go in its fiscal year, HP Enterprise’s CFO, Tarek A. Robbiati, told analysts Tuesday evening that the company is on track to achieve its forecast for three to four percent revenue growth. That is despite the rising U.S. dollar imposing a penalty of over two percentage points on the company’s revenue.
The company’s operating profit margin, moreover, last quarter was in keeping with the company’s goal of operating margin at 10.5%. And although the company lowered its outlook for the year’s free cash flow, to a range of $1.7 billion to $1.9 billion, that’s not very far from the original goal of $1.8 billion to $2 billion this year.
When HP Enterprise’s CEO, Antonio Neri, was pressed by analyst Shannon Cross of Cross Research about the poor results at Dell last week, Neri, without addressing Dell, replied by telling Cross, “I would say this quarter, Shannon, was characterized in my mind by enduring customer demand.” He noted the company’s backlog of unfilled orders is three times what it usually is during this time of the year.
As Neri pointed out, filling those orders will be a continuing payoff for HP Enterprise, especially having locked in higher prices for some of its products:
In term of clearing the backlog, this is going to still take quite a bit of time and that's good news for us because it give us momentum in Q4, into 2023, which is great because remember two things have happened in our backlog. Number one is price for a strong gross margin as Tarek just went through. So, in many ways it's protected for that gross margin, and, number two, we have not seen any meaningful cancellation at all.
The one fly in the ointment for HP Enterprise is that the company has come through a period of booming orders, and the pace of those orders is now coming down. That means that once the company catches up with all the product it has yet to ship, there may be less growth to be had in future quarters.
One analyst, Toni Sacconaghi of Bernstein, asked about that. Neri responded to him by stating that some areas of the business, such as “edge” computing, are even stronger in terms of order growth and aggregate demand than for the company overall. And, anyway, he told Sacconaghi, the company sees nothing to deter it from its stated intention to increase revenue at two percent to four percent, annually, over many years.
Added Neri, “I will use the word steady because obviously you can't use the word growth in the context of the compares here but steady, steady.
“And then within that steadiness, we have growth in some unique segments that continue.”
Steadiness sounds to me rather like stasis, which sounds like what I wrote of HP Enterprise a year ago at this time: A bit of a holding pattern.
What I wrote then was that it “looks like Hewlett is just back to where it was two years ago,” in 2019, in terms of profitability, with revenue growth made easier by pandemic buying.
It still feels that way now, like HP Enterprise has some real winners in terms of product, such as its GreenLake cloud service, but the company also still struggles to “move the needle,” as they say, with such winning products.
For now, I would be more interested in two of the TL20 stocks to consider that are within HP Enterprise’s field of view, Arista Networks and Pure Storage. Both of those are cleaner stories, more focused vendors.
Most holders of HP Enterprise will be in it for the 48-cent per share per year dividend. At Tuesday’s price of $13.65, that’s a 3.5% yield. Not bad.
HP Enterprise shares are down twelve percent this year, only half the twenty-four percent decline of the Nasdaq Composite, and less than the S&P 500’s sixteen percent decline.
HP Inc. shares are down sixteen percent this year.
Calculating the TL20
Aug 29, 2022
The TL20 group of stocks to consider is up over eight percent from its reference date of July 15th, besting its benchmarks by a significant margin despite recent market turmoil.
Or, perhaps, because of recent market turmoil. The TL20 were picked as being good deals, and I think that the virtue of picking good stock buys becomes readily apparent in tough markets.
I’ve cited the TL20 regularly, and the daily performance is shown at the top of the TL20 home page. Given that I’m throwing that number around a lot, it seems fitting to explain how the composite performance is computed, in the interest of being transparent about the group. That way, you can follow along and keep your own record of gains and losses if you’re so inclined.
The TL20 performance number I cite is an automatically generated composite produced by FactSet, but the math is simple enough to do with pencil and paper.
The TL20 is a market cap-weighted composite, which means that some of the twenty stocks count for more than others in calculating the change in price from the start date. That approach is common to very popular indices such as the Nasdaq Composite Index.
The procedure is as follows, and you can see it displayed in the columns of the table below. Every one of the twenty stocks, on the day the portfolio is started, which is July 15th, has a closing stock price as its start price. The number of shares outstanding for that stock at that time is multiplied by that starting stock price to arrive at a market capitalization for the stock. That market capitalization is then divided by the total market cap of all the stocks to arrive at what fraction of the total market cap the individual stock represents.
That fraction of total market cap constitutes each stock’s “weighting.” We then use that to “weight” the total, composite return of all twenty stocks.
First, we calculate the simple price appreciation — or depreciation! — since inception, on any given day, for each stock, by dividing the current stock price for each of the twenty by its starting price on July 15th and subtracting one. Then, that percentage price change is multiplied by the weighting. It’s like saying, If a stock has increased or decreased this much, we are only going to consider a fraction of that price change in the total.
The twenty individual percentage changes, thus modified by their respective weightings, are then summed to arrive at a final percentage change in aggregate. It is that final, summary change that counts as the collective performance of the TL20.
All that business means that we take only a portion of the price change of each stock into account when composing the total price change of the group, in proportion to how big that stock is compared to the total market capitalization of all twenty on the day we started. We’re saying, in effect, Changes in small stocks count for less than changes in big stocks.
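For readers who prefer code to pencil and paper, the procedure above can be sketched in a few lines of Python. The tickers and figures here are made up purely for illustration (the real group has twenty names and uses FactSet data):

```python
# Sketch of a market-cap-weighted composite return, the method used for the TL20.
# Inputs per stock: closing price on the start date (July 15th), shares
# outstanding on that date, and the current price. All figures hypothetical.

def composite_return(stocks):
    """Cap-weighted percentage change of the group since inception."""
    # Market cap of each stock on the start date.
    start_caps = {t: d["start_price"] * d["shares"] for t, d in stocks.items()}
    total_cap = sum(start_caps.values())

    total = 0.0
    for ticker, d in stocks.items():
        weight = start_caps[ticker] / total_cap        # fraction of total cap
        change = d["price"] / d["start_price"] - 1.0   # simple price change
        total += weight * change                       # weighted contribution
    return total * 100  # expressed as a percentage

# Three hypothetical stocks in place of the real twenty.
stocks = {
    "AAA": {"start_price": 100.0, "shares": 1_000, "price": 110.0},  # +10%
    "BBB": {"start_price": 50.0,  "shares": 4_000, "price": 45.0},   # -10%
    "CCC": {"start_price": 20.0,  "shares": 5_000, "price": 24.0},   # +20%
}

print(round(composite_return(stocks), 2))  # → 2.5
```

Note how the result, 2.5 percent, is far below the simple average of the three price changes (6.7 percent), because the biggest stock, BBB, fell: changes in small stocks count for less than changes in big stocks.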
You can see that in the table. Although Snowflake had a great week, jumping twenty-three percent on Friday, following a positive quarterly report Thursday night, Snowflake doesn’t count nearly as much as some others because its market cap, $47 billion, is much smaller than, say, Nvidia, worth $394 billion. The total price appreciation for Snowflake since July 15th, almost thirty-four percent, counts for only a little bit more than Nvidia’s three percent gain in that time once the two are weight-adjusted.
The market-weighting approach, while very common, can be disputed. For one thing, it gives giant companies such as Tesla and Nvidia a greater effect upon the overall performance of the group, the same thing that happens with giant companies such as Apple in the case of the Nasdaq. Over time, one can “re-weight” the group, as small companies become bigger, to make things less lopsided.
Another objection is that this form of weighting does not take many factors into account. It considers neither relative stock valuation nor dividends, which contribute to the total return of a stock over and above its mere price appreciation.
Also, as one astute reader has already pointed out, this simple market cap weighting doesn’t take into account factors such as risk adjustment, especially relative to benchmarks. It’s common to weight price appreciation by how risky a stock is on a relative basis.
On the plus side, this simple price calculation is, as I said, pretty easy to calculate without a lot of fuss. I think there’s a lot to be said for simplicity.
Keep in mind that benchmarks such as the ARK Innovation ETF can have different forms of weighting, which may tend to make comparisons a bit less than apples to apples.
I hope you find all this as fascinating as I do, and I hope it helps to make the TL20 more transparent.
If you think there are better ways of keeping score, let me know!
Informatica CEO: The opportunity is ginormous
Aug 27, 2022
There may be no second acts in politics, but there are lots of them in technology and, sometimes, perhaps better than the first time around.
In 2015, Informatica, a software company with which I was very familiar at the time, was taken private by a group that included the private-equity firm Permira, Salesforce Ventures, and Microsoft in a deal valued at about five billion dollars. It was one of those surprising times when a very important company simply steps aside.
There was an interesting process going on as Informatica stepped out of the limelight, namely, a transition to cloud computing.
“We are a startup that went from zero to a billion dollars in seven years,” says Amit Walia, who is a nine-year veteran of Informatica, and who had been the head of product before the buyout but is now the chief executive officer.
Walia and I were talking recently via Zoom, following Informatica’s successful second-quarter earnings report on July 28th, in which the company’s revenue topped consensus by five percent, the third quarter in a row of outperformance since the company’s return in October in an initial public offering led by Goldman Sachs.
Informatica is not exactly a startup; it is a company worth seven billion dollars that is heading toward a billion and a half dollars in revenue this year.
But Walia is fond of pointing out what is entirely new about the company, namely, that a billion dollars of this year’s business, two-thirds of the total, is the subscription kind, from new products developed after the buyout for new and expanding use cases. Within that, some subscription-based software runs in a company’s “private cloud,” while a good chunk, the fastest-growing part, increasingly runs in a publicly-hosted cloud computing operation run by Informatica as a service — lines of business that didn’t exist for Informatica in 2015.
My question for Walia was the question that comes to mind immediately from having known a company and seeing it come back around: What’s different now?
Informatica, which was founded almost thirty years ago, had been one of the biggest vendors of a technology called “ETL,” an acronym for “extract, transform, and load.” ETL was one of those obscure things that chief information officers spend a lot of time on but that ordinary people never hear about.
When you’re constructing a database, such as, for example, the sales analysis database, to study how your different sales regions do, you have to separate out the customer records, the product records, the transaction records, and put them in a form where they can be sliced and diced by analysts.
That requires dealing with all the scary details of production databases, such as incompatible formats and unreconciled time series and such. For many years, Informatica had a good business selling the ETL tools to let CIOs solve such headaches for giant data sets, for departments, for the C-suite, for the conversion from one database program to another, for the staging of copies of data in different repositories in different facilities.
Informatica was synonymous with building an infrastructure for data, at a time when companies wouldn’t put stuff in the cloud but guarded it in their own data centers.
Nowadays, of course, people increasingly dump data in the cloud, and newer companies such as Snowflake and Confluent promise to let their customers sort and sift a lot of stuff without assembling things as much, just dump it all there.
In what seemed a sign of the times, another prominent public ETL company, Talend, last year was taken private by Thoma Bravo. It was as if the M&A market was saying, the world no longer needs ETL stuff.
What use would there be, I wondered, in a Snowflake world for a company that had been pre-cloud? How central can Informatica be in a new era?
“Great question,” says Walia. “Because we existed before, the question becomes, That problem existed then, does it exist now?”
His answer is a rhetorical question. “If a company didn’t exist in 2015, and they got VC funding, and today they have a billion dollars in revenue, what would you say?”
All that matters in business, Walia observes, is whether there is a business need, and whether a given company has the products to address it.
That is very much the case for Informatica, says Walia. “Our billion-dollar subscription business was created on the coat-tails of all-new products, and the new problems the world is facing,” he says. “We are not an ETL company any more,” says Walia. Where once it was a hundred percent of its business, the ETL product is only a quarter of sales today.
The billion dollars to which he refers is the component of the company’s annual revenue that is now from subscription-based sales, including cloud computing programs, all of which didn’t exist in 2015 when the company sold what are called perpetual licenses for use strictly in a company’s data center.
“We are neither Snowflake nor Confluent,” says Walia, given that Informatica is already profitable unlike those companies. “We know how to run a proper P&L of scale.”
What Walia and team refer to as “the intelligent data management cloud,” or “IDMC,” is a suite of products that include ETL, but also newer tools to integrate multiple third-party applications; tools to run compliance, privacy and “governance” rules against a company’s data; and tools to define a master record of data amidst multiple copies, to name a few of the things the company wasn’t used for that it has spent seven years building.
“We spent a billion dollars in R&D over the last five years” to make the new programs, he notes. “We are the leader in all four Magic Quadrants — not a leader, the leader,” he points out, referring to the branding that research firm Gartner uses to anoint category leaders in technology.
The problems the world is facing, says Walia, are, in fact, the standard business problems of information now exacerbated by the fragmentation created by putting all kinds of data into cloud computing. Stuff is simply everywhere, and so rather than being a solution, cloud is a bit of the problem at the moment.
“The world is a lot more hybrid, and a lot more fragmented,” than it was before companies started moving to the cloud, he observes. “There is no one system of record, they are using Azure for this [Microsoft’s cloud computing service], Snowflake for that, Databricks [a data management startup] for this, GCP [Google’s Cloud] for that — the competition for that is between Azure and Snowflake and Databricks and Google, not us.”
In fact, Informatica partners with those various companies to advance its mission. “Snowflake is duking it out with Databricks every day, they partner with us.”
Walia’s point is that the things that companies want to do, their problems, span these various sets of cloud technologies, at a higher level.
“If I am Unilever, and I want to have a single view of a supplier, called ‘Supplier 360,’ that has nothing to do with Azure,” says Walia, “that’s a business problem.”
Walia is, in fact, clearly fired up about the many gigantic customers that use his IDMC and have been using it for years for an increasing array of things.
“If I’m Lufthansa, and I’m running a customer program, that has nothing to do with Snowflake, that has nothing to do with Databricks, that’s a business problem.”
“If I’m the CDO [chief data officer] of MasterCard, and I want to have governance and compliance across my enterprise, that has nothing to do with one data warehouse, that’s a business problem.”
“The cataloguing, governance, master data management — those use cases have ginormously scaled as we’ve grown the company,” he says.
“I just spoke to the CEO of a top-three bank in the country,” Walia tells me. “They have a deal with Azure, they have a deal with Amazon, they have a deal with Snowflake, they have a deal with Databricks, and they came to me and said, help me manage this complexity.”
The increasingly complex problems of his customers mean that the architecture, the choice of how repositories of data fit together, has to be considered for different purposes. Information is never simply dumped in one big pile.
“If I’m a bank, you use your ATM, you update your balance on your app, but you don’t dump all your data in that [data] warehouse in one instance,” he says. “The use cases have different types.”
The virtue of Informatica is to have the breadth, with its IDMC, to span those many considerations, he says. “We serve all of them, not just one.”
Given the breadth, “you see how we come into play,” Walia states. “There is a reason why all of Confluent is less than my cloud business, which is half a billion today growing at forty percent,” he observes.
Confluent, the real-time streaming cloud company, founded a year before Informatica’s buyout, has trailing twelve-month revenue of $488 million. Walia told the Street in July that Informatica’s cloud computing “ARR,” its annualized recurring revenue, the total value of contracts signed for cloud, out twelve months in time, will reach $438 million to $448 million by the end of this year.
“Half a billion dollars growing at forty percent — that’s a pretty fast-growing startup,” he remarks.
Again, though, Informatica is not exactly a startup. The company had a billion dollars in sales in 2015 when it was taken out. It is not as if Informatica was a pole vaulter who stood by the bar and jumped over from a standing start.
Walia’s point, however, is that all of the subscription business, the vast majority of revenue, from all-new products, has nothing to do with the legacy sales of what is called “maintenance” in the software business, the license to continue running a given software program.
How much of the new billion dollars is existing workloads moving over to the cloud? “Zero!” says Walia. “There has been zero conversion, the billion dollars of subscription is all net-new.”
The traditional maintenance license was to run an Oracle data warehouse in a company’s own data center. That business is still around, about forty percent of revenue in any given quarter. But it exists at the same time that new stuff is happening that has nothing to do with the old Oracle data warehouse, says Walia.
“Subscription means, I’m starting a new workload on [Amazon] Redshift,” Amazon’s data warehouse program running in AWS. “I’m starting a new workload on Snowflake — those are completely different things.”
The process of moving the old stuff, the forty percent of revenue that is maintenance, is underway, and proceeding slowly, over years. “My maintenance did not go away, less than two percent of my maintenance [revenue] has migrated to the cloud,” he notes. “We started that process last year, it’s a very complex process,” a process that touches on some of the most sensitive data, he notes. “Some of our customers are running their 10-K, 10-Q reporting in an on-premise data warehouse using Informatica.”
That suggests a nice prospect for Walia and team. Customers keep paying him to keep the old stuff running in their data center, while they buy the new stuff at a faster clip, and they still may replace the old stuff with even more new stuff at some point.
“The beauty is, for every dollar of maintenance, I’m getting two dollars of cloud ARR,” says Walia. “Because that’s the point where we have the ability to take our cloud platform and also cross-sell and up-sell to a bigger use case.”
When will all of that legacy maintenance revenue convert one hundred percent to cloud? “That is a billion dollar question,” he says. “These are mission-critical workloads, you have to go slowly, because you can’t break what works for the customer.”
The smart money, says Walia, is banking on this ginormous opportunity, including a lot of returning Informatica institutional investors:
“A lot of our investors, all the blue-chip mutual funds, Fidelity, Wellington, and Franklin, and T. Rowe — they obviously helped us back in the days, and they’re all back in because the opportunity is a lot bigger.”
Moreover, Walia is in the unique position of having both tons of money from the October IPO — a billion-dollar capital raise — and also real cash profits. “Last year, at the IPO, I had to apologize for making money, it’s back in fashion now, as you can see.”
Informatica’s free cash flow was two hundred million in the past twelve months, a thirteen percent margin, and a yield of about three percent based on a recent stock price of $22.67.
Some of the cash will be used to pay down a remaining balance of pre-IPO debt of $1.8 billion, this year and next, he says, with a goal of getting below six times leverage, meaning debt divided by Ebitda. (Currently, the measure is about six and a half times.)
But there is still a priority on R&D spend, running about nineteen percent of sales. That percentage is below the spending rate of Snowflake and others, but it is not stingy, Walia insists. “We haven’t skimped on investing, we would not have been able to build all this stuff without investing in R&D,” he says. “We will continue to do that.”
More important than the absolute rate of spend, to Walia, is the fact the company has made that transition from selling a traditional license to software to selling subscriptions, and especially cloud. That “model transition,” as it’s known on the Street, is often a time of trial for software companies, when the financials become messy and timing of revenue becomes less certain. Informatica is through the eye of the storm and in safe harbor in that respect.
“We went from a hundred-percent-licensed company to a hundred-percent-subscription company, our gross margins have barely budged because we know how to run a good P&L,” he says. In fact, gross profit margin this year, projected at about eighty-one percent, is a big jump up from seventy-five percent two years ago.
As we wrap up, I point out to Walia that the stock is hardly expensive in the realm of software valuations, especially for a company already profitable, trading at an enterprise value of less than five times next year’s projected revenue of $1.74 billion. That is just a little bit higher than the take-out multiple in 2015 of just under four times.
Is the stock too cheap?
“Absolutely, we are very cheap given the growth we have on the top line,” says Walia, adding, to cement the picture, “subscription [ARR] growing thirty percent-plus, cloud [ARR] growing forty percent-plus, eighty percent gross margins, great cash flow.
“The market will correct itself,” he says of the valuation. “You never look at the day and the week and the month, you keep building, and you keep executing.”
Snowflake surges: ‘cRPO’ gives a warm good feeling
Aug 25, 2022
Nice night for TL20 pick Snowflake, which reported fiscal second quarter revenue, for the three months ended in July, more than six percent higher than Street expectations, growing at a very smart eighty-five percent; and forecast its revenue for its products, excluding its professional services, this quarter to be in line with consensus.
The stock soared this evening almost eighteen percent in late trading, a nice reversal of the big sell-off in May.
That six percent beat was the highest since the year-ago report. More important, it was a big sigh of relief. Snowflake, you’ll recall, sells on what is called a “consumption” model, meaning, it bills customers not at a pre-ordained time, like the beginning of each quarter, but only as they use the software. That means revenue from cloud has an unpredictable element.
As I mentioned in my interview with Confluent CEO Jay Kreps the other day, there has been a concern about how well consumption would hold up amidst worries about corporate belt tightening, whether people would temper their use to slow expenses.
Not the case, as it turns out. In fact, Snowflake's CFO, Michael Scarpelli, told analysts that out of the total value remaining in signed contracts, what’s called “remaining performance obligation,” or RPO, the “current” portion, “cRPO,” meaning the amount the company expects to recognize as revenue over the coming twelve months, is fifty-seven percent.
Think of Snowflake like a waiter standing at the table, hunched over the diner, predicting how fast they’re going to eat the meal.
Why is fifty-seven percent cRPO important? It’s up from fifty-three percent the prior quarter, and fifty-two percent in the quarter before that, the December quarter. In other words, the company is predicting that its customers are eating faster than they had three months ago, which is good for revenue.
In fact, that is the highest cRPO proportion since the company first disclosed the metric in its report of March of last year.
Scarpelli has in the past told the Street not to rely too much on RPO, nor cRPO, because they can only tell you so much. And I would agree, since it’s the obsession with such metrics that helped the Street reach a really unhealthy level of tech valuations in recent years, before this year’s plummet.
But the surge tonight suggests many can’t help looking at cRPO and getting a warm good feeling. Plus ça change…
Also likable in the report was the company’s adjusted free cash flow margin of twelve percent, in line with its pledge to steadily improve profit. Remember that during the company’s meeting with analysts back in June, Scarpelli implied Snowflake had gotten religion, promising it would try harder for profit. (The “adjusted” part means Snowflake backs out options expense.)
I’m delighted, of course, at the big surge in price after hours, given that it now gives Snowflake a big gain of twenty-seven percent since the inauguration of the TL20 on July 15th, just behind number two Hubspot and number one DigitalOcean.
Also reporting Wednesday from the TL20 was II-VI, which had a solid report and outlook but sold off today after reporting in the morning. The stock had been the standout gainer heading into the report, so that’s not totally surprising. Many traders were likely collecting some profits today.
Nvidia also reported, after having warned two weeks ago of weak results, dragged down by gaming. The company tonight offered a weak forecast for the current quarter as well, sending its shares down mildly in late trading. The analysts this evening are referring to this as the “kitchen sink” forecast, the thing that’s, hopefully, needed to relieve investor anxiety and allow them to focus on the positives.
We shall see.
The weak part of the business is not AI, which is going great guns, up sixty percent in the quarter, but rather video game chips, sales of which are under pressure.
During tonight’s conference call with analysts, the company had to address crypto-currency. Given the current crypto “winter,” the sense among all analysts is that some portion of the video game weakness is a result of people doing less Bitcoin mining, a function that consumes a lot of GPUs that also are used for video games.
CFO Colette Kress told the Street the company really doesn’t know how much to attribute to the crypto winter:
As noted last quarter, we had expected cryptocurrency mining to make a diminishing contribution to Gaming demand. We are unable to accurately quantify the extent to which reduced crypto mining contributed to the decline in gaming demand.
There’s one more name from the TL20 left to report: Pure Storage, coming August 31st.
TL20 holds its head above water thanks to II-VI
Aug 23, 2022
This is why you pick a portfolio of stocks, not just one.
The chart above shows the five-day trend of the stocks in the TL20 group of stocks, and also their cumulative return since inception on July 15th. On the left is the total gain each stock had as of five trading days ago, since inception, and on the right is where they stand in total gains at Monday’s close.
Monday was a big day of declines for most shares, the Nasdaq Composite dropping almost three percent.
The chart for the past five days is ugly, but I’ve highlighted how one name bucked the trend. Fiber-optic component vendor II-VI, which had initially been a laggard in the group, has risen about three percent in the past five sessions, the only one of the twenty to see a net gain. It’s nice how one good stock will step to the fore when other names in the portfolio are weakening.
II-VI is up eleven percent since the TL20’s inception.
Other high-flyers such as Hubspot and Block dropped big-time these past five days, some of it from people likely taking profit on their sharp run-ups, some of it from Monday’s deep disfavor.
Hubspot had been up forty-three percent a week ago, and is still the best TL20 name. Despite double-digit declines this past week, it’s up over twenty-five percent. Block is still up about eleven percent despite its sharp fall this past week.
The second chart, below, displays the TL20’s collective performance relative to benchmarks since inception. The picks have managed to eke out a small gain relative to most, while the ARK Innovation ETF and Bitcoin are currently trailing sharply.
As mentioned in the inaugural briefing on the TL20 group of stock picks, the group is a hypothetical portfolio, not an actual trading vehicle. So, when I make comparisons from time to time to other investments, keep that fact in mind, because it makes most comparisons not quite apples-to-apples.
For example, at the top of the TL20 page, and in every update, I compare the TL20 performance as a weighted market-cap composite. That composite is created automatically by FactSet. The actual price of ETFs, in contrast, may fall below the value of their holdings.
In any event, roughly speaking, the TL20 is exceeding all its relevant benchmarks.
Confluent CEO: Kafka is essential even in tough times
Aug 23, 2022
With concern about recession still prevalent, companies are being scrutinized over how well their products and services can hold the attention of customers should things really head south.
“People are trying to figure out what sticks around in tough times,” says Jay Kreps, co-founder and CEO of software maker Confluent, which came public a year ago. “It’s hard for them to tell, there are lots of products in the data space.”
Kreps and I were talking via Zoom last week following a highly successful second quarter report August 3rd by Confluent that sent the shares up eighteen percent after-hours.
Confluent sells a commercial version of the open-source Kafka software, a rather abstruse piece of technology that is becoming increasingly important to firms of all sizes in many industries.
Investors, Kreps tells me, have been mulling the deep question, “What really are the things where purchases are going to continue to happen, where customers keep expanding and renewing?”
It would appear, based on the latest results, that Confluent can claim some portion of the stuff that companies are willing to continue to buy whatever their economic concerns.
The second quarter report, the company’s fifth quarterly report, was also its fifth beat on sales. Although this beat on sales was smaller than the previous ones, it was still eighteen percent higher than the Street was expecting.
One particular data point stood out, the company’s revenue from cloud computing. Confluent sells Kafka software as both a right to use “on premise,” in a company’s own facilities, and also as a managed service inside public cloud facilities. The cloud version, roughly one third of Confluent’s quarterly revenue, rose by a hundred and thirty-nine percent last quarter, more than double the already very high total growth rate of almost sixty percent.
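A bit of back-of-the-envelope arithmetic shows how a segment that is a third of revenue and growing 139 percent fits with the blended company growth rate, and what it implies for the rest of the business. The figures here simply normalize revenue to 100 and take total growth as 58 percent, consistent with the “almost sixty percent” above; this is a sketch, not a disclosed breakdown:

```python
# Implied growth of the non-cloud business, given: cloud is roughly a
# third of revenue and grew 139%, while total revenue grew about 58%.
total_now, total_growth = 100.0, 0.58    # revenue normalized to 100
cloud_now, cloud_growth = 100.0 / 3, 1.39

total_then = total_now / (1 + total_growth)   # year-ago total
cloud_then = cloud_now / (1 + cloud_growth)   # year-ago cloud
rest_growth = (total_now - cloud_now) / (total_then - cloud_then) - 1
print(round(rest_growth * 100))  # roughly 35 (percent)
```

In other words, under these assumptions the non-cloud business would be growing in the mid-thirties, with the cloud product pulling the blended rate up toward sixty.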
The durability of the company’s cloud revenue is important because Confluent sells it as a “consumption” pricing product, meaning, it bills customers not at a pre-ordained time, like the beginning of each quarter, but only as they use the software.
That means revenue from cloud has an unpredictable element. Some customers have a minimum commitment, but others can use it in an open-ended way. There has been concern about how well a product billed on a usage basis would hold up.
The disparity is clear in one of the ever-popular software metrics, the company’s “remaining performance obligation,” or RPO. That is a measure of the total value of deals done, to date, stretching years out into the future, that has yet to be recognized as revenue.
Last quarter, RPO rose eighty-one percent, faster than revenue, and totaled $591 million, greater than this year’s Street consensus for revenue for Confluent of $570 million. There’s a lot that’s signed but not yet recognized, and a lot of it that has no definite time frame.
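A quick check, using only the figures just cited, shows what the eighty-one percent growth implies for the year-ago backlog, and how the backlog stacks up against the full-year revenue consensus:

```python
# Figures from the report above: RPO of $591M, up 81% year over year,
# versus a full-year Street revenue consensus of $570M.
rpo_now = 591.0            # $ millions
rpo_growth = 0.81
rpo_year_ago = rpo_now / (1 + rpo_growth)
print(round(rpo_year_ago))     # ~327 ($ millions, implied)
print(rpo_now > 570.0)         # True: backlog exceeds the year's consensus
```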
So, given that disparity between what’s signed and what might be realized as revenue, there’s uncertainty for investors. If Confluent customers can hold off realizing that spend, they might do so, which would be tough for Confluent.
“There was a lot of talk” among investors “about these consumption models where you pay if you use more or less,” Kreps observes. Investors, he says, have been asking, “What happens to that [consumption] if there’s a tighter economy? Doesn’t everybody, kind-of, dial down their usage and spend less?”
Happily, “that really wasn’t what happened to us,” he says. Instead, “we’ve actually accelerated,” enough so to boost the revenue outlook yet again. “It wasn’t a surprise to us,” he says of the favorable trend, “but I think it was unexpected to some investors.”
“If you give up on either one of those,” growth or profit, says Kreps, “you have a highly efficient small business, or you have a big mess, where you’ve built something very inefficient.”
Not only were the reported results higher, Confluent raised its year revenue outlook and narrowed its forecast for its expected net loss for the year. It was the third time Confluent has raised its year outlook, and its current promise, a range of revenue of $567 million to $571 million, stands well above the original forecast for $511 million.
Kreps’s observation about this is that the Kafka software is more urgently needed than some other kinds of programs.
“We tend to serve these mission-critical software apps that are significant areas of investment,” he says. “I think that criticality tends to help when it comes to tighter times, which is probably what we’re entering now.”
As I explained in my first interview with Kreps, in September, the Kafka software is a form of “middleware” software — basically, plumbing for data. Kafka moves data to where it is needed by a given application. Kreps calls it “data in motion,” and you can think of it as a way to “stream” corporate data from one app to another — kind-of like how a person streams Netflix shows from a server to their PC or phone.
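To make the “data in motion” idea concrete, here is a toy, in-memory sketch of the pattern Kafka implements: an append-only log per named “topic,” with each consumer keeping its own read position so it can stream records at its own pace. This is purely illustrative, and is not Confluent’s or Kafka’s actual API; all names here are invented.

```python
from collections import defaultdict

class MiniLog:
    """A toy illustration of Kafka's model: append-only topics,
    per-consumer offsets. Not a real message broker."""
    def __init__(self):
        self.topics = defaultdict(list)   # topic -> append-only record list
        self.offsets = defaultdict(int)   # (consumer, topic) -> read position

    def produce(self, topic, record):
        self.topics[topic].append(record)

    def consume(self, consumer, topic, max_records=10):
        pos = self.offsets[(consumer, topic)]
        batch = self.topics[topic][pos:pos + max_records]
        self.offsets[(consumer, topic)] += len(batch)
        return batch

log = MiniLog()
log.produce("payments", {"user": "alice", "amount": 42})
log.produce("payments", {"user": "bob", "amount": 7})
print(log.consume("fraud-checker", "payments"))   # both records
print(log.consume("fraud-checker", "payments"))   # [] -- caught up
```

The key property, and the reason the “streaming” analogy works, is that producers and consumers are decoupled: any number of apps can read the same topic independently, each at its own offset.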
Confluent has been able to articulate to the C-suite how that data in motion will help a bank company or a manufacturing company with their respective projects. “We’ve gotten smarter at really conveying, in these different industries, what’s going to be meaningful,” he says. “They’re doing different projects around data in motion, but the high-level capabilities are very similar.”
One thing that may be helping Kreps and team, he suggests, is that there just aren’t that many alternatives right now. “There aren’t that many ways to do what you do with Kafka,” he says, even though cloud operators Amazon and Microsoft have some streaming middleware products, such as “Amazon Kinesis” and “Azure Event Hubs,” respectively.
Mostly, competition is companies trying to build their own, and it is that effort that Kreps has sought to replace with his company’s offerings. Assembling middleware is a little bit like building the pyramids. Having a vendor to go to such as Confluent can take some of the load off a company, observes Kreps.
“I think the move to cloud services is going to continue in a more recessionary time,” he says, “just because people get much more real about what they want these teams of very expensive software engineers working on, and it tends to be the things that are unique to their business, for unique competitive advantage, versus basic infrastructure capabilities.”
Even if companies wanted to just grab the open-source Kafka software and use it, Confluent is more efficient, he argues.
“These cloud services are usually very TCO-positive [total cost of ownership] versus a team of Kafka experts” in-house, says Kreps.
For a young company with a novel technology, it is often the case that the person running the show is very close to the product. Kreps and colleagues built Kafka when he was at LinkedIn, before co-founding Confluent almost eight years ago. This past quarter, he took on an additional role, head of product development, on an interim basis, after Ganesh Srinivasan stepped down following four years in the job. The search for a permanent replacement is underway.
As both CEO and now product lead, one of his main goals is to refine an essential aspect of Kafka as a commercial product, ease of use.
“If I think about what do we want to do as a company — there’s a paradigm shift towards streaming data in motion,” Kreps says. “Our job is to make that easy to harness and get the value out.”
When Kreps and team at LinkedIn developed Kafka, as a skunk-works project, and released it to the world as free, open-source code, it caught on first among the most tech-savvy outfits, namely tech companies such as Uber, which had the smarts to run it.
“At the very early days of the company, it was powerful but hard,” recalls Kreps with a laugh. “You had these tech companies that could use Kafka at very large scale, but with a very heavy investment in engineers to operate it and build it into all their operations.”
If something’s valuable, observes Kreps, “people will eventually do it,” but maybe not anytime soon if it’s too difficult.
One way to make it easy is the cloud service — “doing it for them,” he says. “We’ve made remarkable steps on that.”
But “the other dimension is how do we make it really easy to build applications that harness data in real time?”
That’s “something new,” he says. It involves things such as having recently added the venerable SQL query language that runs most databases. “I think this is an area that’s still in the early days,” he says. “I think we want to make it something where you can take advantage of your data in motion really quickly.”
In that light, Kafka, and data in motion, is the proverbial “new paradigm,” a technology that is still evolving in practice.
Another way to make something easier to use is sometimes to make it free. In the June quarter, the company took down its paywall for using its cloud product. Meaning, to try out the product on an evaluation basis, you no longer have to plunk down a credit card at all, just click download. That is something of a return to the early days of software distribution, when “freeware” programs were put out in the world by startups with no up-front charge, as a way to build an audience.
Something like that is necessary for the way programmers want to consume things now.
“The newer generation of software engineers, their expectation is just to start with the cloud version, and have it be really easy to build with,” he explains. “If there are too many credit cards and sales conversations, if there’s too much friction in the system, you don’t get it.”
I’m stunned to hear Kreps say that taking away a credit card requirement is “taking away friction.” After all, it wasn’t so long ago that Amazon’s revolutionary cloud computing proposition was, Just plunk down your credit card and get going. Apparently, even that is now too much for people.
“It’s the nature of things,” says Kreps with a laugh. “It goes from blogs to YouTube to TikTok — the progress of humanity is toward shorter attention spans, whether we like it or not.”
The fact is, in a larger corporation, he says, “It can be relatively hard to get something new on the company credit card.” Early in its use by a prospective customer, then, the goal for Confluent must be to make things very easy.
Already, he says, removing friction is paying off. “We saw great growth in sign-ups” in the quarter for the trial version, he notes, rising by fifty percent from the prior quarter. “That is quite significant.”
Now, he says, “it’s on us to make sure all those people have a great experience and become paying customers over time.”
Lowering the paywall, and getting more people in the door, can lead to what Kreps has told the Street are the “network effects” of Kafka, where one project starts to pull in others, in a virtuous cycle. Such follow-on use can have real financial benefits.
Confluent has what’s called “dollar-based net retention” of more than one hundred and thirty percent. DBNR, as it’s referred to, is a measure of how much stuff a customer buys now versus in a prior period. A rate of one hundred thirty percent is very healthy in the software world, given that anything over one hundred and twenty percent is considered really good.
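For the arithmetic-minded, DBNR can be sketched as a simple ratio. The cohort dollar figures below are hypothetical, invented purely to land near the rate Confluent reports:

```python
def dbnr(cohort_spend_now, cohort_spend_year_ago):
    """Dollar-based net retention: current-period spend by the cohort of
    customers that already existed a year ago, divided by that cohort's
    spend back then. Expansion net of churn, in one number."""
    return cohort_spend_now / cohort_spend_year_ago

# Hypothetical cohort: $10.0M of spend a year ago, $13.1M now.
print(round(dbnr(13.1e6, 10.0e6), 2))  # 1.31, i.e. 131 percent
```

Because new customers are excluded, anything above 1.0 means the existing base alone is growing, which is why the Street treats rates above one hundred and twenty percent as a strong signal.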
By bringing more people into the fold with free sign-ups, there’s a greater chance to convert them to the customers who will buy more and more, on average.
Given that Kreps has this deep feel for the product, and how it’s used, I ask him if finding a replacement for Srinivasan as product lead will be like hiring himself.
“No, I don’t think so,” he replies. “The goal is to scale what we do but not to have ten clones of the CEO.”
Instead, he says, “there’s really a learning process to finding new people.”
“I always come in with a bunch of opinions” in the search for someone, he says, “But then you encounter people of all sorts.”
It really comes down to “what things do we want to be excellent at, and if we have thirteen things we want to be excellent at, that’s kind-of the same as having none.
“So, the early process is about meeting people, understanding what they’re about, and getting really, really clear about what the organization needs.”
Despite five quarters of great reports and great forecasts, and an unusual degree of strength amidst the concern over software sales generally, there is one thing that might concern investors: profit, or lack thereof.
The Street expects Confluent to lose money through 2024, the average estimate being a non-GAAP loss per share of eighteen cents that year, on Ebitda of negative $32 million, and a small free cash flow loss of $6 million.
What does all that mean given that the investing world is supposed to be in a somewhat changed landscape, where profit matters more than it did nine months ago?
“It’s a balance,” is Kreps’s response.
“We are going after a really big space, and TAM [total addressable market], and it’s important to capture that market,” he says. “At the same time, you have to build a healthy business in the process, and that’s hard to do.”
He adds, with a chuckle, that "if you give up on either one of those, you have a highly efficient small business, or you have a big mess, where you’ve built something very inefficient.”
Confluent, says Kreps, has tried to be “thoughtful” about its intended “framework” for profit. “Exiting Q4 of 2024, we expect to be operating margin-positive,” he says. “That’s the timeline we’re marching to.”
Meantime, there have been a number of improvements in profit metrics, he says, that “have been well-received.” Among them, the company’s non-GAAP operating margin improved by eight points last quarter, year over year, to negative 33.5%. That was a result of being “proactive” about spending in the quarter, among other things, CFO Steffan Tomlinson told the Street.
The company’s gross profit margin on the cloud portion of its business, moreover, “improved substantially,” Tomlinson noted. The company doesn’t disclose the actual cloud gross margin, but says it is lower than the overall corporate gross margin. Efforts to reduce hosting costs, and to get better pricing from the public cloud operators with whom Confluent partners, have led to improved economics, said Tomlinson. The Street adores those kinds of “improving unit economics,” as they’re generally called.
Bottom line for Kreps, as he puts it, is that “you don’t want to stop investing in the engineering and R&D talent to build the future of the space,” meaning, the epic pyramid-building project. “Now is the time,” he says. “You want to do it in a way that’s thoughtful, but go after it.”
Oh, and, “it helps we are extremely well-capitalized,” he adds. Indeed. With one of the biggest IPOs in recent memory last year, with net proceeds of over three quarters of a billion dollars, the company ended last quarter with two billion in cash on the balance sheet, he notes.
“We are not in a position where we are up against a wall to raise money again,” he points out.
Confluent shares are down sixty-four percent this year, and up forty-five percent in three months.
Cambium is ready to ‘ride the new growth curve’
Aug 20, 2022
Wireless networking vendor Cambium Networks has been such a drama for the past year that it has been difficult to talk about anything other than the supply chain, and the havoc it has wreaked.
Happily, the clouds have begun to part the last two quarters, as the company got more of the parts it needed to ship product on time. That positive turn afforded me an opportunity last week to discuss with CEO Atul Bhatnagar things other than just the immediate obstacles.
“We sense there is a new S-curve for growth emerging for Cambium,” says Bhatnagar, in a meeting via Zoom.
“The last time we entered an S-curve was 2016, and every five or six years, there’s a technological change, driven by customer needs, driven by new applications.”
Cambium, in case you’re unfamiliar, is what used to be the wireless networking business of Motorola Solutions, which split off from the handset business of Motorola, which was sold to Google in 2011 (and later sold to Lenovo). The company got started that same year, when private-equity shop Vector Capital took the parts of Motorola that had been developed for broadband corporate networking and combined them into one company.
What Bhatnagar was describing to me with his S-curve statement is the idea that companies are entering a building spree with wireless networks, to proliferate the ways that connectivity is available to their employees and to other parties. It could include high-speed Internet to the pool area of a hotel, or a way to send 5G cellular connections from the local loop back to a central office.
I’ve covered these ideas a few times with Bhatnagar in our interviews over the past two years, though a lot had been obscured by the supply-chain problems. The report the company offered this month for the June quarter saw revenue come in slightly — half a percent — above expectations, which was a big relief after a six-percent miss in the March quarter.
Although the outlook for this quarter was a couple percent below what analysts have been modeling, at least the company was able to maintain its full-year outlook. That was a big relief after the company cut its outlook in May by twenty percent.
Basically, as Bhatnagar had said to me in February, the company’s business is crawling out from under the supply-chain situation. The stock, at a recent $20.50, has anticipated this turn to the positive, rising fifty-eight percent in three months, though the shares are still down forty-two percent the past twelve months.
Now, some of the focus shifts from problems back to where the business is going in broad strokes. Bhatnagar is fond of the expression “playing a long game,” and another expression, by hockey player Wayne Gretzky, about “skating to where the puck will be, not where it’s been.”
To him, the S-curve is the culmination of several trends that he’s been discussing with me now for two years, amplified by the effect of pandemic lockdowns that have made people more dependent on bandwidth everywhere.
“Three years back, if you gave someone fifty or sixty megabits per second, people were reasonably happy,” he reflects. “But all the work-from-home, education from home, gaming going on in the homes, the need now is hundreds of megabits per second.
“To provide the new need, you need these new highways coming into your community, basically, from a single-lane highway to now a four-lane highway — that’s the way to think about it, that’s why there’s a technological shift.”
The multiple elements of Cambium’s approach include WiFi 6, the latest version of the WiFi networking standard, which has been wildly popular. WiFi 6 is expected to lift Cambium’s enterprise sales by more than forty-five percent this year, a figure the company revised upward this month from its prior outlook.
And then there are radio systems for building out 5G networking in a “fixed” fashion (not for handsets, but for building-to-building links and infrastructure) in cities, either for extending a carrier’s fiber installation over the last five kilometers, at the edge of the network, or for “back-hauling” 5G networks, meaning connecting the part that users see back to an operator’s central office.
“The way these new growth curves work,” explains Bhatnagar, “if you’ve been innovating multiple years, and preparing the ground for the next generation technologies, you can ride the new growth curve” for years by reaping the fruits of hard work in product development.
A key element of these products is that they serve the opening up of rich new swaths of electromagnetic spectrum being made available to carriers and enterprises, including what’s called the “C band,” from 4 gigahertz to 8 gigahertz, with a special sweet spot at 6 gigahertz. The new spectrum makes it affordable for carriers to give every user multiple gigabits of wireless connectivity.
“Anticipating what frequencies, what bands, what communications standards, is a very important aspect of predicting where the puck is going to be,” Bhatnagar tells me.
“When I say a new S-curve growth curve, that’s the next three, four, five years; the new technologies will provide the new growth curve” for Cambium.
So far, the enterprise WiFi business, led by WiFi 6, is already on the S-curve path, with that forty-five-percent-plus revenue growth this year. The overall business, weighed down by the supply problems, is in decline this year, with revenue expected to drop fourteen percent. If the various markets come together as Bhatnagar suggests, there is the prospect of much higher growth after this year.
In 2021, for example, sales rose twenty-one percent. And back in 2017, during the last S-curve to which Bhatnagar refers, sales rose twenty percent. The Street models a return to twenty-one percent revenue growth next year.
As he points out, “these products are not created in one or two quarters; most of the time these products take us two years to three years to create, hardware and software.”
In other words, hopefully, the future is about the payoff that has been happening. The radios for fixed 5G service in the 28-gigahertz spectrum band, a product that has been delayed and that has been eagerly awaited by Street analysts, began shipping in March, and has steadily racked up multiple early deployments with carriers, including for things such as connectivity in stadiums.
That product had originally been designed to transmit a hundred megabits per second download speeds. After consultation with customers, Cambium increased that to four hundred megabits per second. “Now you can do honest-to-goodness, hundreds of megabits at scale, cost-effectively,” he says of the 28-gigahertz product.
The product, to Bhatnagar, serves what he believes is the company’s pledge to “focus on state-of-the-art performance while keeping the economics very attractive” for his customers to own and operate. Higher bandwidth, he says, can be a direct boost to medium-sized telco service providers who are trying to increase how much they get for wireless broadband from low rates such as $30 per user per month.
With the 28-gigahertz product, and with new products coming in the fourth quarter for the 6-gigahertz spectrum uses, “starting in Q4, we have the ammunition, and that sets the stage for the growth S-curve to start in ’23, ’24,” he says.
All of the S-curve talk points to an interesting other part of the business, one that gets little attention and is not well understood: software.
Cambium operates a cloud computing-based management service to run the boxes it sells, called “cnMaestro X.” That control panel in the cloud is a service that brings additional fees from customers, but the actual revenue amount is never disclosed by the company, it’s just lumped in with product sales.
The company last quarter for the first time disclosed one single data point, vague but promising: the amount of “annualized recurring revenue,” or ARR, that the company booked from customers for cnMaestro X rose by sixty-six percent from the first quarter to the second quarter. ARR is one of those typical software industry bullet points that companies throw around. It is the value of all the current contracts for software if they were stretched out twelve months into the future.
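As a sketch of the convention (companies vary in exactly how they compute it), ARR is typically a recurring monthly run-rate multiplied by twelve, compared period over period. The dollar figures below are invented, chosen only to land on a sixty-six percent sequential rise like the one Cambium cited:

```python
def arr(monthly_recurring_revenue):
    """Annualized recurring revenue: the current monthly recurring
    run-rate stretched out over twelve months (one common convention)."""
    return monthly_recurring_revenue * 12

# Hypothetical run-rates: $1.00M/month in Q1, $1.66M/month in Q2.
q1_arr, q2_arr = arr(1.00e6), arr(1.66e6)
print(round((q2_arr / q1_arr - 1) * 100))  # 66 (percent, sequential)
```

Note the limitation the next paragraph raises: the growth rate alone says nothing about scale, since a sixty-six percent rise off a tiny base is still a tiny number.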
Without an absolute dollar amount, the ARR figure is of limited value. But it’s a start. “We are beginning to now measure that,” says Bhatnagar of the software business. “By ARR you can see it is still small numbers, but it is growing now very well.”
More broadly, he says, “We have been diligently working on increasing the software revenue for the company,” adding “it’s beginning to get traction with customers.”
“We are now adding value-added content, value-added software, we are able to monetize more effectively, and that journey is just starting.”
In addition to cnMaestro, the company is soon rolling out what it calls the “network services edge,” which adds a capability for various cloud computing applications to talk to the Cambium network equipment’s software.
“That is going to bring in some very key APIs for SaaS applications,” says Bhatnagar, using the terms for “application programming interface,” the way that software communicates, and “SaaS,” the common rubric for cloud. “We are going to make it easier for SaaS applications to use network services and provide some key network services.”
The end result of such software smarts, he says, should be to simplify what his customers, the medium-sized enterprises, have to manage.
“Now, for mid-sized enterprises, instead of disparate boxes, lots of them, you have a few, tightly integrated, managed from a single pane of glass,” he says. “And as a result, the cost of ownership is far lower because you don’t need complex IT infrastructure to manage that.”
The ease of use of the software, claims Bhatnagar, should start to tie the company’s products more closely together in a kind of building-block fashion. The name Cambium has given that portfolio is “Cambium One,” a pitch meant to suggest that the various items work together more easily.
“A lot of mid-sized enterprises, they like the simplicity, they like the fact it just works, like Lego blocks, letting things snap together without complex IT resources, it's a good value.”
Leaving aside what customers think, my years of watching small companies grow suggest the language of Cambium One marks a certain maturation of the product line. A company starts with one product, then moves to several, and at some point, it makes sense to help customers buy a lot of stuff as one integrated system. The software element, and the design of the various parts as a portfolio, show that kind of maturation.
Also part of that maturity are the lessons learned during the supply chain mess, which may make Cambium a more robust operating outfit going forward.
“We can do things now we could not do before,” says Bhatnagar when I ask how the supply-chain issues affected the company. Those new abilities include using a larger balance sheet to buy chips from secondary sources.
It includes finding other ways to deal with challenges.
“For example, last quarter, we knew that shipping logistics would be tight after Shanghai and Shenzhen opened up after lockdowns,” he says, referring to Chinese centers of production for the company’s products. “So we air-freighted products” rather than using sea freight.
“I think, those are our abilities now which five years ago were very limited,” he says. “So, absolutely, responding to the supply-chain challenges, we are in a much better position.”
Cambium was on a path to some very promising sales before the supply-chain disruption up-ended things in 2021, sending the stock down more than fifty percent at one point.
With the shares down forty-two percent in twelve months, but up fifty-seven percent in just the past three months, it seems the Street is getting comfortable with the notion the company is past the worst. If the S-curve is real, that could make things quite interesting going forward.
“Our stock does what our stock does,” says Bhatnagar, when I ask one of my favorite questions, whether it’s a good buy.
“We need to do the right things, the right innovation, have the right people, and long term, the stock will reach its potential.
“As we execute, as things come to pass, the EPS [earnings per share], the revenue, the growth — the numbers will do the talking.”
Wolfspeed soars, joining On Semiconductor in the silicon carbide winners circle
Aug 18, 2022
Among other reports this evening, in addition to Cisco Systems, Wolfspeed, the chip maker I profiled in May as key to the march of silicon carbide technology in electric vehicles, reported a ten percent revenue beat for the June quarter, its biggest top-line surprise in years.
The stock soared by eighteen percent in late trading.
Silicon carbide, in case you haven’t read the lengthy opus cited above, is semiconductor technology that has remarkable properties of electrical conduction. That makes it a kind of wonder material for the traction inverter, the part in every electric vehicle that converts direct current in the battery into alternating current in the motor.
Wolfspeed’s great report follows a strong report earlier this month from silicon carbide competitor On Semiconductor. When I was assembling the TL20 last month, I said to myself, these are both excellent companies, and neither has seen a big discount to its stock valuation.
Well, I guess the joke is on me. Shares of On Semi and Wolfspeed are up thirty-one percent and forty-one percent, respectively, based on tonight’s after-hours leap in Wolfspeed.
The more important point is that the electric vehicle market continues to be a very important one, especially for these two chip suppliers. It’s quite possible the positive trends will last.
The moral of the story is that when you are plugged into an important trend in tech, as I have been with silicon carbide, don’t shy away, stick with it. Lesson learned.
Cisco, your small business barometer, is flashing green
Aug 18, 2022
The fiscal fourth-quarter report of computer networking giant Cisco Systems Wednesday afternoon was uneventful, in a couple of interesting ways. The numbers were fine, but also not really remarkable given that they were set against a backdrop of lowered expectations.
And, more important to the average stock analyst, Cisco’s report contained no trace of weakness among small businesses, which is important given that Cisco has always been a kind of mood ring of the broader economy.
The results are a good sign for TL20 pick Arista Networks, Cisco’s main competition, and a good sign for tech and for the economy broadly speaking.
Cisco had flagged weakness among smaller customers three months earlier. Back then, CEO Chuck Robbins told the Street that “The weakness got a little bit worse as you just sort [of] went straight down, as you would expect with small business, medium business and even smaller-size enterprises that were — didn’t perform as well as the very largest of enterprises.”
There was none of that this time. The economic details on tonight’s conference call with analysts were uneventful — happily so.
Robbins told the Street the company had a record number of product orders in the July-ending quarter, and just closed the second-strongest year in its history for revenue. “We had small business growth in Q4,” noted Robbins, “I think it was double digits [percentage revenue growth] on a global basis, which is a good sign that that's continuing to grow.”
For the revenue outlook for 2023, which was higher than expected, sales are expected to be “strong across our portfolio,” he said. “Right now, we're modeling for just the things to continue as we see them,” he said.
And it’s not just in the U.S. Robbins remarked that “I think, Asia and Europe, we had conversations with our team this week and they seem to — they continue to be reasonably optimistic as we are in the Americas.” The health of Cisco’s market is global, at the moment.
Robbins was challenged by one analyst on tonight’s call, Paul Silverstein of Cowen & Co, who has a long history covering the networking market. Said Silverstein to Robbins, “Once upon a time, networking in general, and Cisco, in particular, was the canary in the coal mine — when there was a macro downturn, you guys were the first to see it.”
Not this time. “I think you're right,” was Robbins’s reply. “And I've said this repeatedly: I think what's driven this is there are two things,
Number one, the pandemic revealed the impact of not keeping your core infrastructure technology up to speed […] the second thing is all of these mega trends that I mentioned in my opening comments, I mean, these customers are rearchitecting their entire infrastructure for the first time in years to deal with hybrid cloud and to deal with all of the mobile workers. We see IoT exploding.
So, what does one conclude from all that? Small businesses, for the time being, are buying stuff. That’s good.
And second, if Robbins is right about what’s happening in tech, networking is in a special place that is getting a lot of attention.
I would suggest all those things are good for Arista. For one thing, Cisco’s growth is nowhere near as high as Arista’s.
The Cisco numbers are the second non-event. When you look at Cisco’s reported numbers, keep in mind, they had already come down by billions of dollars.
The final total of revenue for the fiscal year ended last month, $51.6 billion, was almost four billion dollars lower than what the Street had been expecting in February. That big cut was the result of the fact that Cisco has been constrained by ongoing supply chain issues, which limit its ability to ship product and collect revenue.
So, Cisco is doing better than it would appear, but the numbers are still not anything that’s going to blow you away. The higher-than-expected revenue forecast for this year, four percent to six percent revenue growth, is partly an effect of fulfilling orders in backlog. It’s a little bit of a rear-view mirror, in other words.
Moreover, analysts tonight pointed to some weakness in the quarter in Cisco’s sales of equipment into the “campus” network, the main equipment that ties together corporate computers. You could imagine that might be the result of more companies being “virtual,” spending less time in the office or not even having an office at all.
Robbins’s response: “I mean, clearly, we have competition in the campus. We have very strong competition. We always have. And — but given the volume of the products that we're shipping right now, I don't feel like we're losing significant share.”
Well, we shall see. Arista’s still small in that campus market. Time will tell.
Among other reports this evening, Wolfspeed, the chip maker that I profiled earlier this year as key to the march of silicon carbide in electric vehicles, reported a ten percent revenue beat for the June quarter. The stock soared by eighteen percent in late trading. Read more here.
Note: Names in yellow are TL20 companies.
Amplitude CEO: building the machine for the journey to a billion dollars
Aug 17, 2022
Sooner or later — and better sooner than later — a successful young company needs to bring in new kinds of expertise if it is to mature past its initial success.
Such is the case for ten-year-old, San Francisco-based software maker Amplitude, whose co-founder and CEO, Spenser Skates, tells me his company has brought in the perfect individual to take the company from a quarter billion dollars in revenue annually to a billion dollars and beyond.
Thomas Hansen, the company’s president, joined last month to jump-start the company’s efforts around selling. Hansen comes with big company experience, including Dropbox, but also startup experience, including software maker UiPath.
“He’s seen the journey to a billion dollars in revenue before, that’s where we are going with Amplitude,” says Skates in a chat via Zoom not long after the company’s earnings report on August 3rd.
The view to a billion dollars is important for Skates, who has told me in each of our meetings so far — we first spoke following the company’s IPO in September — that he is focused on a “long-term” mission.
The August 3rd report was the company’s fourth straight quarter since the IPO of beating estimates for revenue and net loss per share. It was also the second quarter in a row in which the company raised its revenue outlook for the year and narrowed its expected net loss.
“We posted some great results, which we’re really proud of, and we’ve seen a lot of traction for Amplitude in spite of the macro because we are just so critical to how people build their products,” Skates tells me.
Amplitude makes tools for software developers that help them understand how their apps are being used, and how the apps could be better if a developer made adjustments. If you use a smartphone app such as Calm for sleep and meditation instruction, Amplitude’s tool tells the Calm app’s developers how people move through the app, what features they use and don’t use, how they respond to new features, etc. — a kind of diagnosis of what is working and what is not.
Your usage is being measured, and Amplitude’s analysis is constructing the map of your habits, along with thousands, even millions of other users.
You could say the same sense of measuring and improving apps is evident in how Skates regards his own business. Having gotten this far in selling stuff, he says, “you need to set up a machine for the next level of scale.”
What does that mean?
“When you are a company of $50 million in revenue, and you have thirty account executives and sales people, that’s what we’ve been used to doing,” explains Skates. “We have set up the forecasting and pipeline generation and deal review for that scale.”
But, as things get larger and larger in an organization, new challenges emerge.
“How do you make sure everyone is really clear what they’re accountable for, for the quarter?” he asks, rhetorically. “How do you make sure that the goals for the very top level of the company cascade down through the different layers of the organization? How do you make sure all the functions from marketing to sales to customer success are coordinated?
“All those challenges are different at scale: as we get close to a thousand people at Amplitude, they are qualitatively different from when you were small.”
With Hansen’s coming aboard, Skates tells me the company now has the muscle to sell to larger and larger accounts, and to sell a broader array of product.
Hansen had taken UiPath to almost nine hundred million dollars in revenue last year, on course to over a billion this year.
In particular, Skates lauds “his ability to hire great leadership talent” at Dropbox and UiPath. “A really key part” of reaching a billion dollars in revenue, annually, “is getting the right leaders in all of go-to-market at that size and scale; he’s done that before,” says Skates, referring to the rubric for sales and marketing and customer support, “go-to-market.”
“All the talent he’s brought into his previous worlds is exceptional,” adds Skates.
Building that selling machine is important now not only because the company is becoming larger but also because Amplitude’s offering is becoming more extensive.
Skates is the kind of CEO who is “close to the metal,” as they say. He is captivated by the finer points of product development, as opposed to some CEOs who look at things from on high as more of a financial problem.
When we last spoke, in May, he told me his company was on a journey to expand its product line, to, as he put it, “capture lightning in a bottle twice.” The portfolio has so far consisted of three products, the “Analytics” program that started things off, and two newer offerings, called “Recommend” and “Experiment.”
This past quarter, the company added what’s called “Customer Data Platform,” or CDP, a kind of enveloping infrastructure that routes data between the analytics tools and other parts of the sales and marketing software toolkit.
For example, CDP will make it easier for a company to connect its applications to what’s called “A/B testing,” a practice of systematically changing aspects of the program for one group of users and not another, to test what works better.
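For readers who want the mechanics, A/B testing usually boils down to deterministically bucketing each user into one group or the other. A minimal sketch in Python — the function and its details are my own illustration, not Amplitude’s actual implementation:

```python
import hashlib

def assign_variant(user_id: str, experiment: str, variants=("A", "B")) -> str:
    """Deterministically bucket a user into a test variant by hashing
    the user id together with the experiment name."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# The same user always lands in the same bucket for a given experiment,
# so the "A" and "B" populations stay stable across sessions.
print(assign_variant("user-42", "new-onboarding-flow"))
```

The point of hashing rather than random assignment is that a returning user sees a consistent experience, which is what makes the comparison between the two groups meaningful.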
“It will eventually be commoditized over time,” says Skates of CDP. There are CDP offerings from Salesforce and Adobe, but by offering its own, Amplitude lets customers use its software without going to those other vendors for one more piece to make it all work.
“You can get started with Amplitude much faster, and the tools integrate more deeply.”
The expansion of the product platform, the arrival of new sales talent, and the raised forecast speak to an operation that is humming along toward what is expected to be $235 million in revenue this year. The Street, of course, will still worry about the macro-economic risk.
To Skates, that matter has in a sense been dealt with. In February, Amplitude’s shares plunged by sixty percent when Amplitude cut its outlook for full-year revenue. The current forecast is now above that February forecast, and Skates tells me the cut in outlook at the time was a rather shrewd move.
“I think in a lot of ways we were ahead of the rest of the market in that we were very conscious that a lot of folks had gotten overly optimistic about SaaS markets,” he says, using the term for software-as-a-service, the rubric for cloud software. The downbeat forecast was a “reset,” and “since then we’ve been able to raise it twice, which has been fantastic — that’s a much better place to be as opposed to having to reset mid-stream.”
That doesn’t guarantee Amplitude won’t yet encounter challenges, of course. Skates insists that the results for this first half of the year convince him that even in a rough macro-economic climate, “investing in product data is the one thing that people are continuing to do now.” People are unlikely to pull the plug on Amplitude versus some other software, he believes.
The fly in the ointment for some investors will be the lack of profit. Operating margin is still negative, and though the company’s free cash flow swung from negative to positive last quarter, that was an outlier. The Street expects free cash flow to remain negative for Amplitude for at least the next two years, and Skates reiterated in our talk that the positive number was a one-time blip.
“We are setting up, in three to five years out, we do want to be at positive ten percent free cash flow,” says Skates.
I point out to Skates that this year, we are in a regime where profit is much more important to investors, something that CEOs of other software companies, such as Dynatrace and DigitalOcean, make sure to feature in their conversations with investors.
“We’ve always had a more balanced view of that, where we want to set ourselves up for long-term growth and a sustainable business,” says Skates when I point out that attitude.
“When you are this early, you still want to make sure to make bets for the future of the business as opposed to being focused on cash generation.”
Everyone, he observes, wants to have “Rule of 60,” the Street short-hand for having a mix of, say, thirty-percent revenue growth, and thirty percent free cash flow margin. “I don’t think any SaaS business at our size is at Rule of 60,” he says.
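For clarity, the arithmetic behind that Street short-hand is trivially simple — my own illustration, not a figure from the company:

```python
def rule_of_n(revenue_growth_pct: float, fcf_margin_pct: float) -> float:
    """The Street's quick health check: revenue growth rate
    plus free-cash-flow margin, both in percentage points."""
    return revenue_growth_pct + fcf_margin_pct

# The mix Skates describes: thirty percent growth plus
# thirty percent free cash flow margin sums to a "Rule of 60."
print(rule_of_n(30.0, 30.0))  # 60.0
```

The more commonly cited bar is the “Rule of 40”; sixty is a far higher hurdle, which is Skates’s point about companies of Amplitude’s size.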
Amplitude, moreover, “has never been a growth-at-all-costs company,” he contends, “we’ve always had a balanced approach.” The company has over three hundred million dollars in the bank, he points out, and “we’ve always been very lean,” and so, Amplitude doesn’t need to “significantly adjust our spend profile,” he says.
“Consequently, it’s a great time to build market share,” he says. “I feel great about the plan that’s in front of us.”
I ask Skates if the losses bug his audience on the Street.
His response is to turn to the long-term view he offered me in May.
“I talked about this in my founders letter” when the company went public, he notes. “I’m very much thinking about how do we set this up for the long term, because if we do that right we’ll be incredibly successful.”
Skates is something of a maverick, I’m reminded. He thumbed his nose at the IPO process last year, opting for a direct listing of shares instead of an underwriting syndicate. He is still telling investors to respect that his company in a sense marches to its own drummer.
“I specifically said, don’t buy our stock if you want us to sell the company,” he reminds me. “It’s the same thing today: In spite of what’s going on with the markets and everything on the outside, the trends driving our business are the same trends; we’re leading in our category; we have the best product out there; and if you just continue focusing on, compounding that, you’ll do really well.”
Amplitude, he says, “wants investors that have that mentality, that are along for the journey with us.”
I point out that Amplitude shares, despite a nice bounce following the earnings report, are down sixty-six percent this year. Is the stock a good buy? I ask.
Skates, as he did when we spoke at the time of the IPO, demonstrates a detached appreciation for finance and markets. “Prices are a great mechanism,” he tells me. “They contain the complete information about what people think is the fair value for the long term,” adding, “I think a lot of the fluctuation you see is the change in the price of capital, for the time-value of money.”
That’s smart, but how does that apply to Amplitude stock? Skates pushes the question of price aside.
“That’s not something I’m an expert on; I’m here to drive the success of the company for the long term; if you’re focused on that, we’d love to partner with you to get you to be an owner of our stock.”
Pressing the matter, I point out the recent stock price, $18.19, represents a drastically reduced forward sales multiple of less than seven times the next twelve months’ sales, way down from over eighteen times back at the beginning of the year. Does that make the stock more attractive?
Skates offers me a reply consistent with where he’s been coming from. “We’re good to buy if you’re a believer of the very long-term of the trends that are driving Amplitude’s business.”
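For those who want the arithmetic behind a forward sales multiple, here is a quick sketch. The share price is the one cited above; the share count and next-twelve-months revenue are hypothetical placeholders of mine, not Amplitude’s actual figures:

```python
def forward_sales_multiple(price: float, shares_outstanding: float,
                           ntm_revenue: float) -> float:
    """Market capitalization divided by expected
    next-twelve-months (NTM) revenue."""
    return (price * shares_outstanding) / ntm_revenue

# Hypothetical inputs: $18.19 share price, 115 million shares,
# $300 million of expected NTM revenue.
print(round(forward_sales_multiple(18.19, 115e6, 300e6), 2))  # 6.97
```

With those placeholder inputs the multiple comes out just under seven times sales, in the neighborhood the article describes.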
TL20 jousts with the ARKK ETF, led by HubSpot, DigitalOcean
Aug 16, 2022
It’s been a month since the introduction of The Technology Letter 20 group of stocks, and things are going pretty well. The TL20, based on a return since inception of 18.71 percent — calculated by FactSet as a weighted average of the twenty by market capitalization — is not only head and shoulders above the Nasdaq Composite and the S&P 500, it’s also above several other relevant measures.
TL20 tops the iShares Expanded Tech-Software ETF (ticker “IGV”); it tops the Philadelphia Semiconductor Index (ticker “SOX”), even though the TL20 has mostly chip names; and it’s even above the return of Bitcoin, even though Bitcoin has lately been clawing its way back from the low twenty thousands per coin to the mid-twenty thousands.
Perhaps most interesting to me, TL20 has been trading places on several days with Cathie Wood’s ARK Innovation ETF (ticker “ARKK”).
As you can see from the chart, the TL20 since inception on July 15th is a bit above the return of the ARK ETF, 18.71 percent versus 18.32 percent. ARKK has on a few days surpassed the TL20.
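For anyone curious how a cap-weighted return of that sort is computed, a minimal sketch with made-up holdings — these are not the actual TL20 weights:

```python
def cap_weighted_return(returns_pct: list, market_caps: list) -> float:
    """Weighted-average return, with each holding weighted
    in proportion to its market capitalization."""
    total_cap = sum(market_caps)
    return sum(r * c for r, c in zip(returns_pct, market_caps)) / total_cap

# Three hypothetical holdings (caps in billions): a mega-cap up 20%,
# a mid-cap up 40%, a small-cap down 10%. The mega-cap dominates.
print(round(cap_weighted_return([20.0, 40.0, -10.0],
                                [500.0, 50.0, 5.0]), 2))  # 21.53
```

The takeaway is that in a cap-weighted basket, the biggest names pull the blended return toward their own performance.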
Both TL20 and Wood’s ETF have benefited from gains in some high-flying software stocks that have done well the past month. The two top performers of the TL20 are HubSpot, up almost forty-three percent, and DigitalOcean, up almost forty percent.
The ARK ETF, in that same span of time, has seen the biggest gains from Coinbase, the crypto-currency exchange, up seventy percent; DraftKings, the sports franchise, up sixty-two percent, and Unity Software, up sixty percent. Unity makes tools used by video game creators to write and deploy their games.
I would say that at this point, the primary distinguishing trait between these two vehicles’ holdings is the number of flame-outs in Wood’s ETF.
The ARK ETF has had a bumpy ride with some problem names this past earnings season, such as Roku and Twilio, both of which are fine companies but which had disappointing quarterly reports, sending their shares down by double digits.
As a consequence of big one-day drops, ARKK has had more substantial swings up and down; TL20 has not been as erratic.
ARKK is down forty-five percent since the beginning of this year.
Keep in mind, there are subtle differences in how the weightings of the two vehicles are computed. The comparison is interesting, if not entirely apples-to-apples.
The Metaverse’s extraordinary stumbling block
Aug 13, 2022
Jensen Huang, co-founder and CEO of Nvidia, the company that dominates 3-D graphics, used the annual SIGGRAPH 3-D conference this week to tout the impending arrival of what he and others call the next stage of the Internet, basically, a 3-D playground. A chief impediment to that vision is that the technology is incredibly resource hungry, making it unusable by the vast majority of computer users. Huang and team say that they have a plan for that.
This week the big annual conference for 3-D animation technology, SIGGRAPH, took place up in Vancouver. The show has for decades been the venue for breakthroughs in the art of 3-D movie-making, such as the techniques made famous in Jurassic Park.
This week, Nvidia, a company founded in 1993, the year Jurassic Park debuted, told us that there was something really, really big, as big as Jurassic Park, to pay attention to at the show. I’ve covered the details at ZDNet.
That big, big thing that Nvidia is excited about is something that has so far been a whole lot of nothing, the Metaverse, the much-hyped future world that Meta’s Mark Zuckerberg has said will be the next incarnation of the Internet.
Nvidia makes GPU chips to make possible 3-D animation, both in movies and in video games, and now it is partnering with Meta and Apple and lots of other companies to make 3-D applicable to the Metaverse.
In this vision of the Metaverse, the Internet becomes a playground of interlocking 3-D worlds where everything is rendered in striking detail. Maybe you view it on your phone’s screen, maybe you view it in some future VR goggles. Your likeness might be instantiated as a 3-D character in that world, like the characters in Toy Story, called an avatar.
For Nvidia, it’s like 3-D is finally growing beyond a niche of big-budget movies and video games to invade all of the world’s connected existence. Now, that would be a great new opportunity for Nvidia.
Except, it doesn’t seem like that’s going to work. This week’s fanfare at SIGGRAPH was, for Nvidia, the perfect distraction from the company’s pre-announcement Monday morning that its sales of video game chips are weak at the moment. Part of the reason those sales are weak is because gamers, and 3-D movie studios, are niche markets. They are not small, they are worth billions to Nvidia, but they are still not mass-markets. Those markets go through cycles: customers invest in the Nvidia chips, and then they are sated for a while, and they hold off on buying.
And so, the problem of making the Metaverse in a sense boils down to the question, How do you take a niche market like 3-D and turn it into a broad consumer phenomenon? That’s the problem that Nvidia and its partners face with their Metaverse plans.
And therein lies a very difficult technical challenge.
The devices most people have in their possession today, such as the current crop of phones, have neither the processing power nor the battery capacity to run those kinds of intense computer graphics. In many cases, they don’t have the download speeds, either, to fetch from a remote server the intense graphics data the devices need to render those 3-D scenes.
That’s a big deal when you think about how consumer technology evolves. When the Web arrived in the early 1990s, around the time of Jurassic Park, the Web was simple enough that it was compatible with the computers of the time. It was mostly text with some low-res pictures. The hardest part was getting a dedicated “PPP” connection from the phone company. Once you had that, you could use most any computer to browse the Web.
The Web began in simplicity, and economy, and evolved. The Metaverse, in contrast, if it is ever to begin at all, proposes that everyone adopt a lavish, all-or-nothing world dependent on tremendous resources.
How can that happen without expecting the rest of the world to become video gamers or movie studios and shell out hundreds of dollars for the latest GPUs?
“That’s a really, really good question,” said Rev Lebaredian, Nvidia's head of its Metaverse effort, when I asked him that question.
Lebaredian, echoing my concern, first pointed out that “3-D, even though it’s just one more dimension than just 2-D, you might think it’s just 50% harder, it’s hundreds or thousands of times harder, and it has an insatiable appetite for computing power.”
The short answer from Lebaredian is that there’s going to have to be a way to do most of the computer processing in the cloud, in remote data centers, and have only a little bit happen on the device. And Nvidia is working on that technology, what it calls “Omniverse.”
As Lebaredian explained,
The solution to this, ultimately, is to move as much compute as possible into the cloud. We are always going to be constrained by the amount of compute you can have on a device, especially one you put on your head, or in your pocket. And it’s never going to be enough. We’ve already recognized that for virtually all other types of applications, non-3-D applications. Everything you use today, from email to search to maps, most of the compute behind that is done somewhere else, it’s not done on your device. The same needs to be true for 3-D. We’ve built Omniverse to enable this pattern, where we can break apart all of the computation necessary for 3-D worlds into components, into micro services, and move as much of it as possible to the appropriate computers in the cloud. And for the ones that need to be closer to you, you can run that compute — just that compute — near you.
All of which sounds good, except when you consider that the stuff that appears on your screen, the 2-D, and now the 3-D stuff, a bunch of polygons, is all still done locally, for the most part, on your computer, be it a desktop or a handheld. And that stuff is done either in simple fashion, if it’s a smartphone, or in hyped-up fashion if it’s a $2,000 gaming machine.
The user interface has always taken place locally, on the device itself. To do otherwise is extraordinarily difficult.
And so, the explanation that Lebaredian offered, while it makes sense, sounds to me like it’s right back to square one. There isn’t enough computing power on most computers, and won’t be any time soon, to handle all the graphics a Metaverse will entail. And there isn’t enough bandwidth between those computers and the cloud to crunch those polygons in the cloud and then ship them to you as a kind of description for you to render in real time, with low enough latency for it to have the illusion of reality.
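To put a rough number on the bandwidth problem, consider a back-of-envelope calculation — my own, with illustrative assumptions — for streaming fully rendered frames from the cloud instead of polygons:

```python
def uncompressed_gbps(width: int, height: int,
                      bytes_per_pixel: int, fps: int) -> float:
    """Raw bandwidth needed to stream fully rendered frames,
    in gigabits per second, before any compression."""
    return width * height * bytes_per_pixel * fps * 8 / 1e9

# One 4K stream at 60 frames per second, 3 bytes per pixel:
# roughly 12 Gbps uncompressed, far beyond typical home or
# cellular links -- hence the reliance on aggressive compression,
# which in turn adds the latency that breaks the illusion.
print(round(uncompressed_gbps(3840, 2160, 3, 60), 1))  # 11.9
```

Compression can shrink that by two orders of magnitude, as cloud-gaming services do, but compression takes time, and time is exactly what a head-tracked, interactive 3-D world cannot spare.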
I think this is a non-starter. There are all kinds of obstacles in front of the Metaverse. As I’ve written in the past, it seems unlikely that Zuckerberg and the other parties involved have the kind of sensibility to create a world humans really want to inhabit. But this technical issue is the most concrete stumbling block so far.
Now, if I were to offer the contrarian view, I could put on my stock-trading hat, and suggest there is a brilliant opportunity here. Nvidia could partner with Qualcomm, the maker of cellular chips, to make sure that more and more of the infrastructure needed for Lebaredian’s vision is pushed closer and closer to every individual. It would demand Nvidia GPUs sit right next to wireless 5G chips from Qualcomm. In fact, I could imagine one day both companies merging, forming a kind of powerhouse of wireless processing and transmission.
The Metaverse would finally be the perfect “killer app” for the 5G bandwidth that Verizon and others have been pitching for several years now.
That computing power would get closer and closer to you by being set up inside of Amazon cloud computing centers, what Amazon calls “local zones.” As Amazon’s CTO, Werner Vogels, recently told me, those local zones are already running more and more compute for smartphone functions of wireless carriers. Amazon’s cloud is now “getting closer and closer to every individual,” he remarked.
I would make that argument if I believed that things could come together that way. I’m skeptical, though. Especially about wireless. Every time there’s been a new “G” in cellular, such as 3G and 4G, it has mostly been used by carriers to achieve greater operating efficiency, for the purpose of saving money. It hasn’t been used to dramatically improve experiences. Sure, connections today are much faster than twenty years ago, but it’s taken that long to get reasonable Web-surfing on a smartphone.
To get the Metaverse on a smartphone, it might take another twenty years or more, long past most investors’ investment horizon.
Color me dubious on this whole thing. I’ll be interested to be proven wrong.
DigitalOcean CFO: ‘We are masters of our own destiny’
Aug 11, 2022
“There is no need for a prima donna,” says Bill Sorenson when I ask who could replace him.
Monday, Sorenson told the Street he is retiring next year from his post as CFO of DigitalOcean. He was kind enough to talk to me by phone about the company’s second quarter report, and where the company goes from here.
Usually, I talk with the CEO of DigitalOcean, Yancey Spruill, following the company’s reports. This week, it was Sorenson.
“Yancey has been such a great leader,” he tells me. “I’m very excited about the trajectory of the company.”
I was glad to make Sorenson's acquaintance given his many years of experience in tech, and given that he’s from Sheepshead Bay in Brooklyn, New York. Being a New Yorker myself, I appreciate getting the no-nonsense picture from someone who’s from what used to be a really tough neighborhood.
Sorenson’s no-nonsense answer when I asked him who’s going to replace him (the board is actively looking for the next CFO) was that they should be a team player who doesn’t put on airs.
“Culture is important, we are one hundred percent remote,” he points out, meaning staff work from everywhere. “It’s important to have a fabric, a mission, a commonality to hold everyone together,” and whoever comes aboard “will have to fit seamlessly into that fabric.”
The ostensible subject of our talk was the report on Monday, but the stock had already in a sense eclipsed that report. The financial results were modest, with revenue for the quarter missing consensus expectations, and the outlook for this quarter merely in line with what analysts have been modeling.
The stock, however, rocketed in the weeks leading up to the report, and continued to rise following it. DigitalOcean is one of the TL20 stocks to consider, and it has been the best performer since the July 15th reference date of the TL20, up almost fifty percent in that time at a recent $52.67.
“Q2 was an example of our ability to be disciplined on spending,” says Sorenson. “If we have to weather this storm, we can weather it.”
That is despite some substantial concern in the market that DigitalOcean is going to be hurt by any small business weakness in a potential recession.
In case you’re not familiar with the business, DigitalOcean sells a cloud computing service that is a more-economical alternative to Amazon and Microsoft and Google. It is David going up against the Goliaths of cloud.
What the move upward in the shares may in part reflect is a conviction among investors that, as Sorensen tells me, “We are masters of our own destiny.”
Not everyone is so sanguine; Goldman Sachs’s Borges is among the concerned. “DigitalOcean’s key verticals include Blockchain, SaaS builders, Video, Streaming, and Web Agencies,” she wrote, “and we think developer/SMB [small and medium business] activity in each of these end markets is likely to slow.”
Amidst such apprehension, it’s interesting to hear Sorensen say the company is master of its destiny. His chief point is that the company got a heck of a lot of money before the IPO window slammed shut this year.
“We have $1.2 billion of cash on the balance sheet,” he points out. No need for follow-on offerings any time soon, in other words, something he is “so glad” about, he says.
What is great about talking to a CFO who was in the room when deals were done is hearing the epic tale of getting money in a roaring market, a market that won’t be as easy again, probably, anytime soon.
“When I joined we had a bank facility that was the primary funding for the company,” recalls Sorenson. “And we determined how much cash we were going to need, and we quickly went out and we up-sized that deal.”
The company then “quickly started laying the groundwork to go public,” realizing that the company’s venture investors, including Andreessen-Horowitz, “were past their horizon,” about six to seven years, and wanted a return soon.
Following the IPO in March of last year, the company did a convertible debt offering of one and a half billion dollars. “We were just primed and ready, the convert market was just hot as a pistol,” he recalls.
“We went out thinking we might raise $850 million, and they kept calling me back, saying, You want another hundred million more?”
“So, we raised very quickly $2.5 billion” between IPO and converts, he says with a bit of a chuckle.
“Now we are in a great position to use that money to help us grow the business,” he says. “We are not dependent on third-party sources.”
The other part of the story, of course, is not just the balance sheet but what keeps getting added. DigitalOcean is one software maker that is generating positive free cash flow while many have either never reached profitability or are hoping to get back to profitability some day.
“It cracks me up about CEOs that re-affirm their commitment to being cash flow positive in 2025,” says Sorenson. “There’s a lot that can go wrong between now and then.
“We don’t have to worry about profit and free cash flow, we are there, we’ve crossed that hurdle.”
In a market that is “enormous,” perhaps over a hundred and twenty-five billion dollars in value, annually, the challenge for DigitalOcean is to preserve that profit profile while also investing in the business.
“Q2 was an example of our ability to be disciplined on spending,” says Sorenson. “As you look forward, you’re thinking about things that take a longer time to pay off, and what we’re trying to be disciplined in is the appropriate mix. What do we need to invest for the future? What do we absolutely need to invest for the near term? And within that envelope, demonstrate improving profitability.”
What happens in a tight spot, the kind that Goldman’s Borges is concerned about?
“Your question is a good one,” he replies. On the one hand, the customers are “sticky,” as they say in the software business, especially those who pay more than $50 per month to use the service, notes Sorenson. Those customers are growing the fastest, and they become somewhat embedded in the service because they come to depend on it.
“People come on, they build a business, they build an application, they’re not turning it off tomorrow,” he explains. “While their spend levels may adjust, they’re not basically picking up and moving someplace else because the product itself is still a fifty percent discount” to the larger cloud operators.
Nevertheless, “If we get in a tough environment — we’re starting to see one — we certainly have demonstrated an ability to control our costs,” says Sorenson. “And I think if we get into a situation where there’s continued pressure, we’ll continue to be disciplined to basically not go backwards in terms of our overall profitability.”
Not going backward is good, of course, but the real intention, he says, is to carry on with Spruill’s “mantra,” to “systematically improve the overall operating profile of the company.”
Spruill and Sorenson on Monday told the Street the company will produce operating profit margin this year, on a non-GAAP basis, of fifteen percent to sixteen percent. The company expects to have a free cash flow margin of nine to ten percent.
Over the coming years, the intention, says Sorenson, is to scale those margins to “mid to high twenties” for operating margin; and scale free cash flow margin to “above twenty percent,” all while maintaining thirty percent or better per annum revenue growth, on a path to a billion dollars in revenue in 2024.
“I’d say the company is incredibly fired up because they see the opportunity ahead of us.”
To steer toward those goals, says Sorenson, it will be important that his successor not only not be a prima donna, but also be able to handle the growth trajectory the company is aiming for.
“If you take that growth trajectory to 2026, you’re doubling again to two billion [dollars],” he says.
“The thing that is going to be important is someone who has helped build organizations to several billion dollars of revenue.”
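As a rough check on that trajectory: at a constant thirty percent annual growth, the low end of the stated target, a billion dollars of revenue in 2024 compounds to roughly $1.7 billion by 2026, so doubling to two billion implies growth somewhat above thirty percent. A minimal sketch of the arithmetic:

```python
import math

def years_to_double(growth_rate):
    """Years for revenue to double at a constant compound growth rate."""
    return math.log(2) / math.log(1 + growth_rate)

def project(revenue, growth_rate, years):
    """Project revenue forward at a constant compound growth rate."""
    return revenue * (1 + growth_rate) ** years

print(round(years_to_double(0.30), 1))  # prints 2.6 -- years to double at 30%
print(round(project(1.0, 0.30, 2), 2))  # prints 1.69 -- $1B in 2024 grown to 2026
```

In other words, a true doubling over those two years would require closer to forty percent per annum, which is consistent with the company framing thirty percent as a floor.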
In the past, CEO Spruill has been freewheeling with me in interviews when addressing stock valuation, assuring me at every turn the shares are a good buy. I point out to Sorenson that the shares this year have come down a bunch in valuation, to just under nine times enterprise value (market capitalization less net cash) divided by projected revenue.
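The multiple being cited can be computed as follows; this is a minimal sketch using purely hypothetical figures, not DigitalOcean’s actual numbers:

```python
def ev_to_sales(market_cap, cash, debt, projected_revenue):
    """EV-to-forward-revenue multiple as described in the article:
    enterprise value is market capitalization less net cash (cash
    minus debt), divided by projected revenue."""
    net_cash = cash - debt
    enterprise_value = market_cap - net_cash
    return enterprise_value / projected_revenue

# Hypothetical figures in millions of dollars, for illustration only:
# $5.0B market cap, $1.2B cash, $1.5B convertible debt, $600M forward revenue.
print(round(ev_to_sales(5_000, 1_200, 1_500, 600), 1))  # prints 8.8
```

Note that when convertible debt exceeds cash, net cash is negative, so enterprise value comes out above the market capitalization.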
Is that the right multiple? I ask Sorenson.
“That’s a great question,” he says. “We are in select company where we are generating real free cash flow, and it’s growing; we are improving profitability; we are targeting sustainable revenue growth of thirty percent-plus — I think that puts us in very attractive company to be investing in for the long term.
“We have seen in markets over the past six months that there is a greater focus on cash and profitability, and we are poised to take advantage of the good market.
“If we have to weather this storm, we can weather it.”
Wednesday was a nice stock market session, one in which many companies with so-so results nevertheless saw their shares climb.
And then, there was cancer diagnostics developer Invitae (ticker “NVTA”), which, after reporting a disappointing outlook on Tuesday evening, and seeing its shares rise a modest four percent in late trading, on Wednesday soared by two hundred and seventy-seven percent.
That’s right, Invitae, which had lost eighty-five percent of its value this year amidst very mixed financial results, and massive losses, is now down only forty-three percent for the year.
The near-quadrupling in the stock Wednesday came, as I said, following mediocre results, and amidst at least one downgrade. Analyst Julia Qin with JP Morgan cut her rating on the shares Wednesday to “Underweight” from Neutral, arguing that it now seems more uncertain whether the company can reach a “long-term” growth target for revenue of fifteen percent to twenty-five percent.
San Francisco-based Invitae, which came public in 2015, and which has been selling its services since 2013, offers genetic testing for a variety of conditions, though testing a person’s DNA for hereditary cancer has been the bulk of that work.
The company’s value proposition has been to make genetic testing more accessible to the average individual, to spread the benefits of such testing. The founders, chairman Randy Scott and director Sean George, are genetics veterans, having previously worked at a number of firms in the business, including Genomic Health and Incyte.
The outlook on Tuesday, as I said, was not good. The company is expected to burn through over half a billion dollars in cash this year, and still its revenue for this year, about half a billion dollars, is expected to come in below consensus. Next year is expected to bring another couple hundred million in cash burn. Nothing to get excited about.
The news flow, moreover, has been tumultuous this year, with co-founder George replaced as CEO in July with the company’s COO, Kenneth Knight, amidst a broad restructuring.
The volume Wednesday was as remarkable as the price surge, a total of almost two hundred and thirty million shares changing hands, roughly equivalent to all the basic shares outstanding, and over two thousand percent of the thirty-day average daily volume.
All of this, moreover, transpired from about 2:15 pm, Eastern time, till the close, so, in less than two hours. The shares were trending down slightly in late trading.
Amidst the tribulations of Invitae this year, one of the most interesting things, of course, is that the biggest single holder of the stock is Cathie Wood’s ARK Investment Management. ARK holds almost thirteen percent of the shares, according to FactSet, a position that has been increasing throughout this year, including a purchase of 813,000 shares on Monday, in advance of the earnings report.
Is it possible Wood achieved something substantial on Wednesday by purchasing another very large stake? At prices between $4 and almost $9 per share, it would, of course, cost tens of millions of dollars, which is a lot to pay in one fell swoop.
Not impossible, of course, though it’s also possible NVTA is simply the “meme stock” of the moment. If so, things could quite easily come apart in coming days. Bloomberg opinion columnist Jared Dillian on Wednesday had the interesting suggestion that the meme stock phenomenon is now in the hands of warring hedge funds. If that’s true, then it could be Wood going up against her rivals. (Dillian didn’t specifically mention Invitae.)
If you’re considering whether to dip into this, consider that aside from massive losses this year and next, the stock is currently below what is considered the book value of the shares, $12.43, according to FactSet. Decide for yourself if that means it’s a great buy.
Alteryx CFO: ‘You're beginning to see the fruits of everything that we put in place’
Aug 10, 2022
“We took our lumps in 2020 for a variety of reasons, execution being one of them,” says Kevin Rubin, chief financial officer of software maker Alteryx, in what may be deemed a substantial understatement.
For most of the year 2020, Alteryx seemed like the gang that couldn’t shoot straight. The stock had several perilous ups and downs. Then the company brought in a new CEO in October of that year, Mark Anderson, a veteran of large, highly successful firms such as Palo Alto Networks. He had a plan to revamp the way the company sells its software, a more mature, focused approach able to support very large deals with enterprise customers.
Flash forward, and Alteryx is one of the more popular names in a stock market suddenly scrutinizing software makers more heavily than it did six months ago. It seems what Anderson has been up to is finally coming together.
“I think you're beginning to see the fruits of everything that we put in place the last seven quarters, that it’s starting to really build momentum,” Rubin tells me in an interview via Zoom following Alteryx’s successful second-quarter report on August 2nd, which sent the shares soaring by nineteen percent the next day.
Alteryx makes data analytics software programs that are meant to “democratize” use of data in an organization. Rather than be stuck with a lame Excel spreadsheet, or, conversely, needing to have a PhD in data science, a person who uses the Alteryx software is supposed to be the everywoman, or man, in an organization, a citizen analyst who can derive meaningful insights with less toil.
I’ve been interviewing Rubin for three years now, and it’s nice to have his perspective. He’s been with Alteryx for over six years, so he has a sense of how things changed with Anderson. The new way of selling, he tells me, really is paying off.
“Going into Q3, we have the strongest pipeline that we've seen in many years,” says Rubin, referring to the mix of deals in process.
“I think I would be misleading you if I said that everything was done” with the company’s sales transformation, “but I think the heavy lifting is certainly behind us.”
A year ago, it was still not clear if what Anderson was bringing to Alteryx was going to work. At the time, the stock dropped ten percent on a disappointing forecast, something Rubin chalked up to growing pains.
“This is the third or fourth quarter in a row where we've put up some really good, strong results,” Rubin observes. Indeed, last week’s report was the fourth quarter in a row of sales beating expectations, the fourth quarter in a row of the company’s sales outlook topping consensus as well, and the fourth quarter of the stock jumping on the news.
Alteryx even managed to raise its revenue outlook for the full year, something that stands out when other software vendors such as Datadog and Dynatrace are having trouble maintaining their forecasts.
“Unlike some other commentary we've seen from software companies, we actually saw sales cycles in Q2 improve slightly,” he points out.
What has changed under Anderson, says Rubin, is a more deliberate way to go after prospects, and to build the business with each of them, which is important since most of Alteryx’s revenue in a quarter comes from follow-on sales to existing customers; new business, by contrast, starts small.
“We are landing more strategically than we were doing before,” says Rubin, using the software sales jargon for landing a new customer, part of the tactic of “land and expand,” to get a foot in the door, and spread through a customer.
“If you had asked me that question in 2018, the strategy was a little bit different, where we understood that landing a lot of logos [new customers], not all of them were going to expand with us and we, kind-of, accepted that.” The company went more for the low-hanging fruit, in other words.
Under Anderson, and chief revenue officer Paula Hansen, who came aboard in May of last year, and was promoted this past February, there has been an “operational rigor that we’ve instilled in the business” that treats customers in a more purposeful manner.
“The enterprise focus that we have today is very prescriptive about who are the prospects, and who are the largest companies in the world that we want as customers,” Rubin tells me, “and we have very strong campaigns and initiatives to go after those prospects in that regard.”
That includes having more seasoned sales talent call on enterprises than in the past.
“We're not going into Acme Corporation, a large global 2000, you know, with a twenty-seven-year-old who's never sold that company before,” he says. Instead, partnering with large operations such as PwC has taken on added emphasis for Alteryx. “We're doing a much better job leveraging what the partner network has to offer us in terms of being able to have these higher-level, more strategic, more outcome-driven conversations.” Half of the company’s “annual contract value” in the most recent quarter, meaning the value of subscriptions it will recognize as revenue over a twelve-month period, was derived from partners, he notes.
As a result, says Rubin, unlike some other firms these days, “I'm not anticipating that we're going to have a deleterious impact of sales cycles going forward.”
As important as selling is the fact that the company has “a large concentration of renewals in the back half [of the year],” notes Rubin, meaning customers renewing their contracts, a function of the bulk of quarterly revenue coming from existing customers. That actually eases some of the constant effort of selling.
It’s good to be a CFO who’s been around, I can see from talking with Rubin, in order to grasp the dynamics and mechanics of revenue.
“We have a history and an understanding of what the renewal cycle generally tends to bring from a net expansion rate,” he tells me, referring to one of those metrics so important to the software world, the rate at which existing customers buy more.
“So, we wouldn't have that guidance out for Q3 and Q4 without having a high degree of confidence that those are levels that this business can achieve.”
The progress under Anderson is evident in the beat and raise, and the revenue growth rate, which the Street expects to rebound to forty-four percent this year from eight percent last year. One can wonder, though, what is going to happen in the fall, the time of the annual ritual of companies drawing up their budgets for the new year.
“We obviously don’t know what companies will decide to do,” concedes Rubin. “What we do know is data and analytics is a priority.
“So, if you're a CFO of a large, global 2000, and have a list of areas that you're going to spend on in 2023, there's things above the red line, and things below the red line,” he explains. “We believe data and analytics is going to be prioritized above that line.”
Recession is hard to predict, but it need not decimate the business, he believes. “To the extent that, you know, there is a material shift in the world, we still think, given the large opportunity that exists for us within this space, that we will be able to continue momentum and continue to execute well.”
Despite the security of the sales pipeline, and the current outlook, the company is “watching everything like a hawk,” he says. “We have better visibility, and the ability to course correct, today, than we did in 2020.”
Another potential point in Alteryx’s favor is the possible collapse, in this new credit environment, of some startup competitors not as well funded as Alteryx. Alteryx has almost seven hundred million dollars on the balance sheet, and has demonstrated in the past an ability to generate free cash flow.
“Companies in this space that are not well-funded, and their VCs, or investors, are saying, you've got to preserve cash, it’s going to make it harder for those businesses to come out and compete,” he observes, “they’re going to be a little bit on life support.”
Of course, Alteryx could be an acquirer of such stranded assets. “If there is a transaction that makes sense for the business, there's always a way to get it done,” Rubin says. He adds, “I’m not super-excited about using equity at this valuation, so I don't know that, you know, we're going to run out and do a deal that would result in us issuing equity at these levels.”
By “these levels,” he could certainly be referring to the stock’s forward price-to-sales multiple, using its enterprise value, which is just 5.6 times, even after a twenty-five percent move in the stock in the past week. That is cheaper than Alteryx was five years ago.
Do investors, I wonder, think Alteryx is profitable enough? The company is expected to lose fifty-one cents a share on an adjusted basis this year, before returning to profit in 2023. “This business has been profitable in the past, and if you just look at the long term model and the implied cash flow of this business, you know, it's pretty strong,” Rubin insists.
The company’s cash flow has been negative the first six months of this year, but that is partly an effect of the company having seasonally lower “billings,” the signing up of new business, in its first quarter. That results in the June quarter being the lowest one for collections for Alteryx. “As we typically get later in the year, billings are higher, collections are stronger, and you start to see the productivity lever relative to the performance.”
Still, there is a balance to be struck: growth, still very high, has to be a focus of spending even while leaning toward profitability. “We have been investing intentionally in areas of the business like go-to-market and product, and that has been the focus,” says Rubin. “It's not lost on us, and I hope we've demonstrated over time, that we take a very disciplined approach towards where we put money.”
As far as investor attitudes he has observed, there is no dramatic shift away from growth in software investing, and no exclusive focus on profit.
“If you're a company that is growing fifty, sixty percent, and showing disciplined investments, investors tend to appreciate and understand that there is investment needed to grow that business.”
So, is the stock cheap? Alteryx doesn’t have a buyback program in place, Rubin notes, as “I would rather put our cash today into growth and accretive opportunities than repurchasing the shares.”
“But, yeah, I don't disagree with you,” he adds, about the cheap valuation.
“I vote with my feet each and every day,” is his bottom line. “I come in and I feel very, very fortunate to be part of this ride.
“I think over time, we have an incredible opportunity to disrupt and transform how businesses perform.”
Voting with one’s feet, meaning, showing up, for a seasoned CFO in a hot market for CFO talent, is, certainly, an interesting proof of conviction, it occurs to me.