
Artificial Intelligence

5 Reasons To Leverage AI in Metal Fabrication




Emily Newton is the Editor-in-Chief of Revolutionized, an online magazine showing how technology is disrupting many industries.

There’s growing interest in using artificial intelligence (AI) in manufacturing to achieve numerous benefits, ranging from better worker productivity to less machine downtime. Here are five specific advantages AI can bring to metal fabrication, showing what’s possible.

1. Reduce Waste

Laser cutting is a popular and reliable metal fabrication technique, and implementing manufacturing automation alongside it could lead to even better outcomes. Many laser cutters on the market work with metals and numerous other materials, typically relying on cameras to identify what is loaded. It’s essential that the laser cutter recognize the material correctly: a mistake could create messes or even release hazardous chemicals. That risk makes it important to keep well-chosen safety kits on hand, including items like gloves, with different kits prepared for different hazards.

AI and computer science professionals at MIT recently developed the SensiCut system. It identifies 30 materials with deep learning and an optical method that examines a material’s microstructure with a laser. It can also suggest cutting adjustments or handle surfaces containing multiple materials.

Mustafa Doga Dogan, a doctoral candidate working on the project, said, “By augmenting standard laser cutters with lensless image sensors, we can easily identify visually similar materials commonly found in workshops and reduce overall waste.”

People interested in applying AI for this benefit should first take the time to see which practices or materials typically cause the most waste. That information can guide decisions about how and when to use AI to cut down on waste.
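As a rough illustration of the idea (not MIT’s actual SensiCut pipeline, which uses deep learning on laser speckle images), material identification can be sketched as nearest-fingerprint matching over sensed surface features. Every material name, feature value, and cutting setting below is invented for illustration:

```python
import math

# Hypothetical reference "fingerprints": surface-texture feature vectors
# previously measured for each material (all values invented).
MATERIAL_FINGERPRINTS = {
    "mild_steel": [0.82, 0.31, 0.55],
    "aluminum":   [0.64, 0.72, 0.20],
    "acrylic":    [0.15, 0.40, 0.90],
}

def identify_material(features):
    """Return the material whose fingerprint is nearest to the
    observed feature vector (Euclidean distance)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(MATERIAL_FINGERPRINTS,
               key=lambda m: dist(features, MATERIAL_FINGERPRINTS[m]))

def cutting_settings(material):
    """Suggest laser power/speed for a recognized material; an unknown
    material raises an error rather than risking a bad cut."""
    settings = {
        "mild_steel": {"power_pct": 90, "speed_mm_s": 8},
        "aluminum":   {"power_pct": 75, "speed_mm_s": 12},
        "acrylic":    {"power_pct": 40, "speed_mm_s": 25},
    }
    return settings[material]

observed = [0.80, 0.33, 0.50]  # simulated sensor reading
material = identify_material(observed)
print(material, cutting_settings(material))
```

The point of the sketch is the workflow, not the math: identify first, then choose settings, so a misloaded sheet never gets cut with the wrong parameters.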

2. Decrease Equipment Downtime

Unexpected equipment failures can become costly problems for metal fabricators and other industrial operations. That’s a primary reason more companies use AI in manufacturing with the goal of cutting down those outages.

If leaders get notifications of impending equipment failures soon enough, they can adjust workflows, order parts or take other proactive steps to stop equipment problems from causing shutdowns. Moving ahead with AI-based maintenance gets factories closer to the zero-downtime goal. It can also aid decision-makers in choosing when to replace aging machines and see the best return on investment.

A company’s budget may limit its ability to invest in AI for improved maintenance across all equipment. In such cases, the ideal approach is to determine which machines fail most often or stay out of service for the longest periods. That information provides a good starting point for deciding where AI-based maintenance would deliver the best return.
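That prioritization step can be sketched in a few lines: rank machines by estimated downtime cost so a limited AI-maintenance budget targets the worst offenders first. The outage records and hourly cost below are invented for illustration:

```python
# Hypothetical downtime log: machine name -> outage durations in hours
# (all figures invented for illustration).
outage_log = {
    "press_brake_1":  [4, 6, 3],
    "laser_cutter_2": [12, 15],
    "shear_1":        [1, 2],
}

def prioritize(log, hourly_cost=500):
    """Rank machines by estimated downtime cost (highest first),
    giving a starting point for where AI-based monitoring pays off."""
    scored = [(machine, sum(hours) * hourly_cost)
              for machine, hours in log.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

for machine, cost in prioritize(outage_log):
    print(f"{machine}: ${cost:,} lost")
```

In practice the same ranking could weight in spare-part lead times or production criticality, but even this simple cost sort makes the investment decision concrete.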

3. Meet Rising Customer Demands

AI is also useful for helping metal fabricators deal with increasingly high workloads. For example, aluminum is a versatile material used in everything from disposable food trays to fitness equipment. It’s in continually high demand, but certain societal trends can make people want it even more.

During the COVID-19 pandemic, many people bought recreational vehicles to travel and stay away from home safely. Aluminum and other metals are key components in RVs.

Some metal fabricators publicly disclose their annual production capabilities. For example, one aluminum company offering billet casting and specialty alloy manufacturing can make more than 200 pounds every year at two of its facilities.

It’s worthwhile for factory leaders to see which factors severely limit production ramp-up efforts. From there, they can further explore how AI and manufacturing automation could reduce those obstacles.

4. Explore Complementary Technologies

Committing to using AI in manufacturing may encourage metal fabricators to investigate other advanced technologies that could cause a wider future-oriented transformation. For example, 3D printing with metals can create prototypes more quickly or make on-demand products for customers.

Company leaders interested in using AI in manufacturing may even find existing options for combining it with 3D printing. Massachusetts company Markforged has a cloud-based platform for additive manufacturing that uses AI to function. Incorporating machine learning into the product reportedly makes it smarter with every new part produced. The cloud-based model also means that a 3D printer automatically receives software updates.

If metal fabricators are interested in pursuing AI, additive manufacturing or both but lack in-house resources, they should consider working with a service provider. Doing so could mean gaining access to purpose-built technologies from well-known companies rather than hiring people with the expertise to create the tools from scratch.

5. Achieve Better Quality Control

Manufacturing automation can also bring significant gains to quality improvement efforts. In one recent example, John Deere partnered with Intel to use AI to spot defective welds on its tractors and other industrial equipment.

Finding defective welds is a challenging task, especially given the fast-paced nature of most industrial assembly lines. However, this AI application uses advanced algorithms to detect problematic welds and stop a robotic welder after finding them.

More specifically, a neural network-based inference engine can spot issues in real-time and make the necessary adjustments before continuing. Plus, the computer-vision camera used for this application is just 12-14 inches away from the welded material.
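The inspect-then-halt control flow can be sketched as follows. This is not the John Deere/Intel system; the scoring function, feature names, and threshold are all invented stand-ins for a real neural-network inference engine:

```python
# Assumed defect cutoff, not the production system's actual value.
DEFECT_THRESHOLD = 0.5

def defect_score(weld):
    """Hypothetical scoring: higher porosity and deeper undercut
    both raise the defect risk (weights invented)."""
    return 0.6 * weld["porosity"] + 0.4 * weld["undercut_mm"] / 2.0

def inspect_line(welds):
    """Inspect welds in order; return (welds_checked, halted_at),
    where halted_at is the index of the first defective weld,
    or None if the whole run passed."""
    for i, weld in enumerate(welds):
        if defect_score(weld) > DEFECT_THRESHOLD:
            return i + 1, i   # stop the robotic welder immediately
    return len(welds), None

line = [
    {"porosity": 0.1, "undercut_mm": 0.2},
    {"porosity": 0.2, "undercut_mm": 0.5},
    {"porosity": 0.9, "undercut_mm": 1.5},  # clearly defective
]
print(inspect_line(line))
```

The essential design choice mirrors the article: inference runs in-line with production, so a bad weld halts the robot before further parts are affected rather than being caught in a later batch review.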

When seeking quality improvements with AI, it’s a good idea to choose metrics to track before deploying any new tool. That makes it easier to determine whether the expected gains happen to the expected extent.

Using AI in Manufacturing Can Help Metal Fabricators Succeed

These use cases should help people feel excited about the potential of AI and other manufacturing automation options to enhance operations. Before finalizing any decisions, the affected parties should remember that the advantages may not be immediately apparent. Still, they typically become obvious if leaders allow enough time to investigate how best to use the technology.


Snowflake Launches Arctic: The Most Open, Enterprise-Grade Large Language Model




Snowflake (NYSE: SNOW), the Data Cloud company, today announced Snowflake Arctic, a state-of-the-art large language model (LLM) uniquely designed to be the most open, enterprise-grade LLM on the market. With its unique Mixture-of-Experts (MoE) architecture, Arctic delivers top-tier intelligence with unparalleled efficiency at scale. It is optimized for complex enterprise workloads, topping several industry benchmarks across SQL code generation, instruction following, and more. In addition, Snowflake is releasing Arctic’s weights under an Apache 2.0 license, along with details of the research behind its training, setting a new openness standard for enterprise AI technology. The Snowflake Arctic LLM is part of the Snowflake Arctic model family, a family of models built by Snowflake that also includes the best practical text-embedding models for retrieval use cases.

“This is a watershed moment for Snowflake, with our AI research team innovating at the forefront of AI,” said Sridhar Ramaswamy, CEO, Snowflake. “By delivering industry-leading intelligence and efficiency in a truly open way to the AI community, we are furthering the frontiers of what open source AI can do. Our research with Arctic will significantly enhance our capability to deliver reliable, efficient AI to our customers.”

Arctic Breaks Ground With Truly Open, Widely Available Collaboration
According to a recent report by Forrester, approximately 46 percent of global enterprise AI decision-makers noted that they are leveraging existing open source LLMs to adopt generative AI as a part of their organization’s AI strategy. With Snowflake as the data foundation for more than 9,400 companies and organizations around the world, it is empowering all users to leverage their data with industry-leading open LLMs, while offering them flexibility and choice with what models they work with.

Now with the launch of Arctic, Snowflake is delivering a powerful, truly open model with an Apache 2.0 license that permits ungated personal, research, and commercial use. Taking it one step further, Snowflake also provides code templates, alongside flexible inference and training options so users can quickly get started with deploying and customizing Arctic using their preferred frameworks. These will include NVIDIA NIM with NVIDIA TensorRT-LLM, vLLM, and Hugging Face. For immediate use, Arctic is available for serverless inference in Snowflake Cortex, Snowflake’s fully managed service that offers machine learning and AI solutions in the Data Cloud. It will also be available on Amazon Web Services (AWS), alongside other model gardens and catalogs, which will include Hugging Face, Lamini, Microsoft Azure, NVIDIA API catalog, Perplexity, Together AI, and more.

Arctic Provides Top-Tier Intelligence with Leading Resource-Efficiency
Snowflake’s AI research team, which includes a unique composition of industry-leading researchers and system engineers, took less than three months and spent roughly one-eighth of the training cost of similar models when building Arctic. By training on Amazon Elastic Compute Cloud (Amazon EC2) P5 instances, Snowflake is setting a new baseline for how fast state-of-the-art open, enterprise-grade models can be trained, ultimately enabling users to create cost-efficient custom models at scale.

As a part of this strategic effort, Arctic’s differentiated MoE design improves both training systems and model performance, with a meticulously designed data composition focused on enterprise needs. Arctic also delivers high-quality results, activating 17 billion of its 480 billion parameters at a time to achieve industry-leading quality with unprecedented token efficiency. In an efficiency breakthrough, Arctic activates roughly 50 percent fewer parameters than DBRX, and 75 percent fewer than Llama 3 70B, during inference or training. In addition, it outperforms leading open models including DBRX, Mixtral-8x7B, and more in coding (HumanEval+, MBPP+) and SQL generation (Spider), while simultaneously providing leading performance in general language understanding (MMLU).
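The efficiency claims follow directly from the active-parameter counts. A quick back-of-the-envelope check, using the publicly stated figures (roughly 17B active of 480B total for Arctic, roughly 36B active for the MoE model DBRX, and all 70B for the dense Llama 3 70B):

```python
# Publicly stated active-parameter counts, in billions.
ARCTIC_ACTIVE_B, ARCTIC_TOTAL_B = 17, 480
DBRX_ACTIVE_B = 36
LLAMA3_70B_ACTIVE_B = 70

def pct_fewer(ours, theirs):
    """Percent fewer active parameters than a comparison model."""
    return round(100 * (1 - ours / theirs))

print(f"Arctic activates {ARCTIC_ACTIVE_B / ARCTIC_TOTAL_B:.1%} of its parameters")
print(f"{pct_fewer(ARCTIC_ACTIVE_B, DBRX_ACTIVE_B)}% fewer active params than DBRX")
print(f"{pct_fewer(ARCTIC_ACTIVE_B, LLAMA3_70B_ACTIVE_B)}% fewer than Llama 3 70B")
```

The arithmetic lands on roughly 53 and 76 percent, consistent with the “roughly 50 percent” and “75 percent” figures in the announcement, and shows why MoE routing cuts per-token compute: only a small expert subset runs for each token.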

Snowflake Continues to Accelerate AI Innovation for All Users
Snowflake continues to provide enterprises with the data foundation and cutting-edge AI building blocks they need to create powerful AI and machine learning apps with their enterprise data. When accessed in Snowflake Cortex, Arctic will accelerate customers’ ability to build production-grade AI apps at scale, within the security and governance perimeter of the Data Cloud. 

In addition to the Arctic LLM, the Snowflake Arctic family of models also includes the recently announced Arctic embed, a family of state-of-the-art text embedding models available to the open source community under an Apache 2.0 license. The family of five models are available on Hugging Face for immediate use and will soon be available as part of the Snowflake Cortex embed function (in private preview). These embedding models are optimized to deliver leading retrieval performance at roughly a third of the size of comparable models, giving organizations a powerful and cost-effective solution when combining proprietary datasets with LLMs as part of a Retrieval Augmented Generation or semantic search service.
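The retrieval step that such embedding models power can be sketched as cosine-similarity ranking over document vectors. The vectors below are tiny toy stand-ins, not real Arctic embed outputs:

```python
import math

# Toy document embeddings (in practice these would come from an
# embedding model such as Arctic embed, with hundreds of dimensions).
docs = {
    "refund policy":  [0.9, 0.1, 0.0],
    "shipping times": [0.1, 0.8, 0.2],
    "warranty terms": [0.7, 0.2, 0.1],
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def retrieve(query_vec, k=2):
    """Return the k documents most similar to the query embedding;
    in a RAG pipeline their text is then passed to the LLM as context."""
    ranked = sorted(docs, key=lambda d: cosine(query_vec, docs[d]),
                    reverse=True)
    return ranked[:k]

print(retrieve([0.85, 0.15, 0.05]))
```

This is the whole premise of Retrieval Augmented Generation: the embedding model finds the relevant proprietary passages, and the LLM answers grounded in them rather than from parametric memory alone.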

Snowflake also prioritizes giving customers access to the newest and most powerful LLMs in the Data Cloud, including the recent additions of Reka and Mistral AI’s models. Moreover, Snowflake recently announced an expanded partnership with NVIDIA to continue its AI innovation, bringing together the full-stack NVIDIA accelerated platform with Snowflake’s Data Cloud to deliver a secure and formidable combination of infrastructure and compute capabilities to unlock AI productivity. Snowflake Ventures has also recently invested in Landing AI, Mistral AI, Reka, and more to further Snowflake’s commitment to helping customers create value from their enterprise data with LLMs and AI.

Comments On the News from AI Experts
“Snowflake Arctic is poised to drive significant outcomes that extend our strategic partnership, driving AI access, democratization, and innovation for all,” said Yoav Shoham, Co-Founder and Co-CEO, AI21 Labs. “We are excited to see Snowflake help enterprises harness the power of open source models, as we did with our recent release of Jamba — the first production-grade Mamba-based Transformer-SSM model. Snowflake’s continued AI investment is an important factor in our choosing to build on the Data Cloud, and we’re looking forward to continuing to create increased value for our joint customers.”

“Snowflake and AWS are aligned in the belief that generative AI will transform virtually every customer experience we know,” said David Brown, Vice President Compute and Networking, AWS. “With AWS, Snowflake was able to customize its infrastructure to accelerate time-to-market for training Snowflake Arctic. Using Amazon EC2 P5 instances with Snowflake’s efficient training system and model architecture co-design, Snowflake was able to quickly develop and deliver a new, enterprise-grade model to customers. And with plans to make Snowflake Arctic available on AWS, customers will have greater choice to leverage powerful AI technology to accelerate their transformation.”

“As the pace of AI continues to accelerate, Snowflake has cemented itself as an AI innovator with the launch of Snowflake Arctic,” said Shishir Mehrotra, Co-Founder and CEO, Coda. “Our innovation and design principles are in-line with Snowflake’s forward-thinking approach to AI and beyond, and we’re excited to be a partner on this journey of transforming everyday apps and workflows through AI.”

“There has been a massive wave of open-source AI in the past few months,” said Clement Delangue, CEO and Co-Founder, Hugging Face. “We’re excited to see Snowflake contributing significantly with this release not only of the model with an Apache 2.0 license but also with details on how it was trained. It gives the necessary transparency and control for enterprises to build AI and for the field as a whole to break new grounds.”

“Lamini’s vision is to democratize AI, empowering everyone to build their own superintelligence. We believe the future of enterprise AI is to build on the foundations of powerful open models and open collaboration,” said Sharon Zhou, Co-Founder and CEO, Lamini. “Snowflake Arctic is important to supporting that AI future. We are excited to tune and customize Arctic for highly accurate LLMs, optimizing for control, safety, and resilience to a dynamic AI ecosystem.”

“Community contributions are key in unlocking AI innovation and creating value for everyone,” said Andrew Ng, CEO, Landing AI. “Snowflake’s open source release of Arctic is an exciting step for making cutting-edge models available to everyone to fine-tune, evaluate and innovate on.”

“We’re pleased to increase enterprise customer choice in the rapidly evolving AI landscape by bringing the robust capabilities of Snowflake’s new LLM model Arctic to the Microsoft Azure AI model catalog,” said Eric Boyd, Corporate Vice President, Azure AI Platform, Microsoft. “Our collaboration with Snowflake is an example of our commitment to driving open innovation and expanding the boundaries of what AI can accomplish.”

“The continued advancement — and healthy competition between — open source AI models is pivotal not only to the success of Perplexity, but the future of democratizing generative AI for all,” said Aravind Srinivas, Co-Founder and CEO, Perplexity. “We look forward to experimenting with Snowflake Arctic to customize it for our product, ultimately generating even greater value for our end users.”

“Snowflake and Reka are committed to getting AI into the hands of every user, regardless of their technical expertise, to drive business outcomes faster,” said Dani Yogatama, Co-Founder and CEO, Reka. “With the launch of Snowflake Arctic, Snowflake is furthering this vision by putting world-class truly-open large language models at users’ fingertips.”

“As an organization at the forefront of open source AI research, models, and datasets, we’re thrilled to witness the launch of Snowflake Arctic,” said Vipul Ved Prakash, Co-Founder and CEO, Together AI. “Advancements across the open source AI landscape benefit the entire ecosystem, and empower developers and researchers across the globe to deploy impactful generative AI models.”


Vechain and SingularityNET Combine Blockchain + AI To Drive Sustainability and Build Advanced Enterprise-Grade Tools



Vechain and SingularityNET, industry leaders in blockchain and artificial intelligence (AI) respectively, have announced a strategic collaboration. This partnership of tech giants unites powerful emerging technologies with the potential to radically change how the global economy operates, offering powerful enterprise-grade tools to tackle challenges in sustainability and traditional business.

In particular, the alliance holds great promise for vechain’s work with its partner Boston Consulting Group on building ‘ecosystems’ wherein individuals and businesses are incentivised to act sustainably. SingularityNET’s AI capabilities offer immense potential to enhance and improve these ecosystems, utilising AI technology to pore over data and improve their efficacy.

Vechain and SingularityNET intend to launch joint research initiatives to fortify the efficacy of each respective platform and ingrain the pair at the heart of future digital development. The combination of these technologies can equip businesses with intelligent tools, signalling the onset of a new phase in the era of digitisation.

Dr. Ben Goertzel, the visionary CEO of SingularityNET, expressed his excitement for the massive potential of this partnership:

“The last few years have taught the world that when the right AI algorithms meet the right data on sufficient processing power, magic can happen.

What’s even better is when the algorithms, data and processing are decentralized in deployment, ownership and control — which is exactly the sort of magic that’s going to happen putting the SingularityNET ecosystem’s AI algorithms together with vechain’s deep and diverse enterprise data, on the joint, secure distributed processing power of the two networks.

This combined power will be applicable to sustainability as one of our initial focus areas, but in the end extends across essentially all vertical markets. It’s hard to overestimate the potential here.”

Vechain’s CTO Antonio Senatore commented:

“We’re excited to be collaborating with leading Web3 AI platform, SingularityNET, combining our rich streams of enterprise data with SingularityNET’s powerful and versatile platform.”

“Blockchain and AI offer game-changing capabilities for industries and enterprises and are opening new avenues of operation. We look forward to working closely with the SingularityNET team to build out new services and continue to advance the fore of possibility in web3 and sustainability.”

Vechain and SingularityNET are enabling a new, more interconnected and automated world, driving new capabilities in the fields of industry and in particular, for action around sustainability.


Deepdub and OOONA Announce Strategic Partnership to Expand AI-Based Dubbing Solutions to Global Entertainment and Media Clients




Deepdub, the leading AI-based audiovisual dubbing and language localization company, today announced a partnership with OOONA, a major media localization software provider. This collaboration will bring Deepdub’s advanced dubbing solutions to OOONA’s extensive entertainment and media clients worldwide.

Through this partnership, OOONA will implement a process for connecting their clients to Deepdub’s services. This will enable media companies and content creators worldwide to instantly access Deepdub’s innovative dubbing solutions. Companies will be able to submit their content localization needs with ease via OOONA’s platform and receive tailored proposals from Deepdub that leverage the power of AI emotion-prompting technology. Going forward, clients stand to benefit from more efficient workflows and access to groundbreaking dubbing capabilities unlocking flexibility and scale.

“OOONA’s unmatched expertise in media localization, honed from providing pioneering management and production tools to the biggest names in the sector, makes them an ideal partner,” said Ofir Krakowski, CEO and co-founder of Deepdub. “This collaboration gives us the opportunity to introduce our advanced AI dubbing technology to new clients across the entertainment industry and beyond.”

OOONA is trusted by leading media localizers, broadcasters and a vast user base spanning over 170 countries. “We continue to stay true to our mission of being the core platform that integrates anything our clients need, including any opportunities AI-based solutions bring for localizing audiovisual assets,” said Wayne Garb, CEO and co-founder of OOONA. “We are thrilled to collaborate with Deepdub and further strengthen the services we provide to our customers globally.”

About Deepdub

Deepdub aims to bridge the language barrier and cultural gap of entertainment experiences for international audiences across TV, Film, Advertising, Gaming and e-learning. We provide a high-quality localization service for entertainment content using deep learning and AI algorithms. Deepdub plugs into the post-production process of content owners and provides an end-to-end solution for all of their localization needs. Deepdub’s team consists of technology entrepreneurs, engineers, and scientists, as well as dubbing and post-production specialists with extensive industry experience. The advisory board features prominent media executives such as Kevin Reilly, who held the position of Chief Content Officer at HBO Max and president of TNT, TBS, and truTV, and Emiliano Calemzuk, the former President of Fox Television Studios.

For more information about Deepdub, visit 


About OOONA

OOONA.Net Ltd is a globally recognized provider of professional management and production tools for the media localization industry. Renowned for its state-of-the-art software catering to subtitling, voiceover, dubbing and captioning needs, OOONA’s modular, pay-as-you-go pricing model empowers users to tailor solutions to their unique requirements. Trusted by leading media localizers, broadcasters and a vast user base spanning over 170 countries, OOONA continues to trailblaze advancements in the field of media localization.




Get Business and Marketing Insights from Experts, only on Times of Startups!
