Facebook parent company Meta will continue to invest heavily in its artificial intelligence research efforts, despite expecting the nascent technology to need years of work before becoming profitable, company executives explained on the company's Q2 earnings call Wednesday.

Meta is "planning for the compute clusters and data we'll need for the next several years," CEO Mark Zuckerberg said on the call. Meta will need an "amount of compute … almost 10 times more than what we used to train Llama 3," he said, adding that Llama 4 will "be the most advanced [model] in the industry next year." For reference, the Llama 3 model was trained on a cluster of 16,384 Nvidia H100 80 GB GPUs.
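For a rough sense of scale, here is a back-of-envelope sketch (our own arithmetic, not a figure Meta disclosed) that treats "almost 10 times" as a flat multiplier on the Llama 3 cluster size quoted above:

```python
# Back-of-envelope estimate only: assumes the extra compute comes purely from
# adding H100-class GPUs at the same training duration and efficiency,
# assumptions Meta has not confirmed.
llama3_gpus = 16_384          # reported Llama 3 training cluster size
compute_multiplier = 10       # Zuckerberg's "almost 10 times more" figure
estimated_gpus = llama3_gpus * compute_multiplier
print(f"~{estimated_gpus:,} H100-class GPUs")  # ~163,840
```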

The company is no stranger to writing checks for aspirational research and development projects. Meta's Q2 financials show the company expects to spend $37 billion to $40 billion on capital expenditures in 2024, and executives expect a "significant" increase in that spending next year. "It's hard to predict how this will trend multiple generations out into the future," Zuckerberg remarked. "But at this point, I'd rather risk building capacity before it is needed rather than too late, given the long lead times for spinning up new inference projects."

And it's not like Meta doesn't have the money to burn. With an estimated 3.27 billion people using at least one Meta app daily, the company made just over $39 billion in revenue in Q2, a 22% increase from the previous year. Out of that, the company pulled in around $13.5 billion in profit, a 73% year-over-year increase.

But just because Meta is making a profit doesn't mean its AI efforts are profitable. CFO Susan Li conceded that its generative AI will not generate revenue this year, and reiterated that revenue from those investments will "come in over a longer period of time." Still, the company is "continuing to build our AI infrastructure with fungibility in mind, so that we can flex capacity where we think it will be put to best use."

Li also noted that the existing training clusters can be easily reworked to handle inference tasks, which are expected to make up a majority of compute requirements as the technology matures and more people begin using these models on a daily basis.

"As we scale generative AI training capacity to advance our foundation models, we'll continue to build our infrastructure in a way that provides us with flexibility in how we use it over time. This will allow us to direct training capacity to gen AI inference or to our core ranking and recommendation work, when we expect that doing so would be more valuable," she said during the earnings call.