Lanner joins NVIDIA GTC 21 to showcase AI platforms
At GTC, April 12th to 16th, Lanner will discuss how AI can be structured in a networked approach, with AI workloads distributed across edge networks. We will start from NVIDIA AI-accelerated customer premises equipment, move through the aggregated network edge, and finish at the hyper-converged platform deployed in the centralised data centre.
At Lanner’s GTC sessions you can explore the following topics:
AI-powered hyper-converged MEC server enables intelligent transportation services
Innovative fleet services generate massive volumes of data, which has driven fleet management companies’ data centres to become more agile by integrating compute, storage, and networking into a single hyper-converged infrastructure that simplifies and consolidates all virtualisation components through software.
Thanks to its software-defined nature, the hyper-converged infrastructure leverages existing hardware storage while adopting a virtual controller to manage the physical devices. Lanner developed a hyper-converged MEC server that seamlessly integrates high-performance computing, massive storage, and networking functions into a single appliance.
Powered by the NVIDIA T4 Tensor Core GPU, the MEC server consolidates taxi management tasks such as emergency call services, video surveillance systems, and location-based services. The high storage density of the FX-3420 allows it to record all driving and service data for customer analysis and demand forecasting.
Edge AI inference and NGC-ready server: a hardware perspective
The accelerating deployment of powerful AI solutions in competitive markets, driven by an eruption of AI-based products and services, has pushed hardware requirements down to the very edge of the network. For edge AI workloads, efficient, high-throughput inference depends on a well-curated compute platform. Advanced AI applications now face fundamental deep learning inference challenges in latency, reliability, multi-precision neural network support, and solution delivery.
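To make the multi-precision point concrete, below is a minimal, illustrative sketch of symmetric INT8 post-training quantization — the kind of precision reduction that INT8-capable inference GPUs such as the T4 accelerate. The function names and toy weight data are our own illustration, not part of any NVIDIA or Lanner API.

```python
import numpy as np

def quantize_int8(x):
    """Symmetric per-tensor quantization: map the fp32 value range onto int8."""
    scale = np.max(np.abs(x)) / 127.0
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Map int8 values back to approximate fp32."""
    return q.astype(np.float32) * scale

# Toy weight tensor standing in for a trained layer (hypothetical data).
rng = np.random.default_rng(0)
weights = rng.normal(scale=0.1, size=1000).astype(np.float32)

q, s = quantize_int8(weights)
recovered = dequantize(q, s)

# Rounding error is bounded by roughly half a quantization step (s / 2).
err = float(np.max(np.abs(weights - recovered)))
```

Running inference in INT8 trades a small, bounded rounding error per weight for substantially higher throughput and lower memory traffic, which is why multi-precision support matters at the edge.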
Designed and built in-house by Lanner for secure remote operation and accelerated workloads with the NVIDIA T4 Tensor Core GPU, the LEC-2290E is validated and edge-ready out of the box for streamlined NGC deployments. NVIDIA GPU Cloud (NGC) fast-tracks edge AI solutions with its comprehensive catalogue of containerised, GPU-optimised software for edge-to-core solutions.
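As a rough sketch of what an NGC deployment looks like in practice, the commands below pull and run a containerised inference stack from the NGC registry on a GPU-enabled Docker host; the specific image name and tag are illustrative, and any NGC container could be substituted.

```shell
# Pull a GPU-optimised container from the NGC registry (image tag is illustrative).
docker pull nvcr.io/nvidia/tensorrt:21.03-py3

# Run it with GPU access on an edge appliance such as the LEC-2290E.
docker run --gpus all --rm -it nvcr.io/nvidia/tensorrt:21.03-py3
```

Because the software arrives as a self-contained, GPU-optimised container, the same image can be deployed unchanged from a central data centre down to an edge appliance.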
Don’t miss out on this event. Registration is FREE and gives you access to all the live sessions, interactive panels, demos, research posters, and more.