Microsoft's Bing* Intelligent Search with Intel® FPGAs
In today’s data-centric world, users are asking more of their search engines than ever before. Advanced Intel® technology gives Microsoft's Bing* the power of real-time AI to deliver more intelligent search to users every day. These incredibly compute-intensive workloads are accelerated by Microsoft’s AI platform, Project Brainwave, running on Intel® Arria® 10 FPGAs. Learn how Bing deploys the efficient and flexible architecture of Intel FPGAs to bring users more intelligent answers and better search results.
Artificial intelligence, or AI, is seeing an explosion of growth and applications. AI development is driven by the flood of data generated and collected every day, and by the computation required for smarter algorithms to sift through that data cost-effectively. Intel® FPGAs offer exceptional flexibility and performance, providing unique value propositions across all vertical markets for AI deployments.
Learn how Intel FPGAs leverage the OpenCL™ platform to meet the image processing and classification needs of today's image-centric world. Read the Accelerating Deep Learning with the OpenCL Platform and Intel® Stratix® 10 FPGAs white paper.
The Intel Arria® family delivers optimal performance and power efficiency in the midrange. Intel Arria 10 FPGAs combine a rich feature set of memory, logic, and digital signal processing (DSP) blocks with the superior signal integrity of transceivers running at up to 25.78 Gbps, allowing you to integrate more functions and maximize system bandwidth. Intel Arria 10 FPGAs and SoCs are ideal for a broad array of end markets, including communications, data center, and automotive.
Learn more about Intel Arria 10 FPGAs and our performance benchmarking methodology and results. Read the Intel Arria 10 Performance Benchmarking Methodology and Results white paper.
Intel FPGAs help accelerate many of the core data center workloads that process the growing volume of data our hyper-connected world creates. They can be reprogrammed in a fraction of a second with a datapath that exactly matches your workload’s key algorithms. This versatility results in a higher-performing, more power-efficient, and better-utilized data center, lowering your total cost of ownership.
Learn more about how Microsoft* solves the challenges of real-time AI in the data center by leveraging the agility and efficiency of Intel FPGAs. Read the Serving DNNs in Real Time at Datacenter Scale with Project Brainwave white paper authored by Microsoft.