Brandon Scott
Hardware and compute infrastructure specialist with an electrical engineering background. Covers AI chips, data centers, and the economics of scaling. Expert on the physical constraints of computation.
Brandon Scott covers AI hardware, compute infrastructure, and the economics of scaling models. With a background in electrical engineering from Georgia Tech and an early career designing accelerator architectures, he learned to think in FLOPs, memory bandwidth, and thermals long before "AI chips" became a marketing category. He then moved into a strategy role at a cloud provider, where his job was to make the numbers work: capacity planning, utilization, and the brutal trade-offs between performance and cost.
At AI-Telegraph, Brandon translates the physical and economic constraints of computation for readers who mostly see "inference" as an API endpoint. He tracks the roadmap of accelerators, interconnects, and data center design, and he's blunt about what is feasible at scale versus what only works on a benchmark slide. His pieces dissect claims of "orders of magnitude" improvements, scrutinize TCO models, and show how hardware decisions ripple through model design, inference pricing, and product strategy.
He is particularly interested in the emerging stack around custom accelerators, near-memory compute, and sparse or quantized inference. Brandon's coverage exposes where the bottlenecks really are: supply chains, power and cooling, memory, networking, or simply capital expenditure. His work is aimed at readers who understand that AI is not just algorithms and data, but also copper, silicon, concrete, and balance sheets.