You can’t get away from it: IoT, the Internet of Things.  Connected devices everywhere.  Estimates put the number of connected devices at over 50 billion by 2020.  At this rate, our toasters will be connected to the internet! From your smartphone, you could control the heat level, monitor the temperature, look for cool spots, and watch the carbohydrates carbonize in real time!

While we may not actually see the connected toaster (at least I hope not), many companies are looking into their own IoT strategies.

Manufacturers are no different.  Industrial IoT (IIoT) is helping manufacturers learn more about their own products.  All this information is helping to create brand-new business models: selling outcomes instead of products.  GE is probably the most notable example, selling flight hours instead of jet engines.  With the immense amount of data generated by sensors on these engines, GE is able to predict failures and analyze performance to fine-tune operation.

The question now is not how a company can utilize this type of operational feedback, but when.  Manufacturers will have to understand how that data fits into their systems engineering.  As of now, this requires a company to apply the digital thread and digital twin philosophy.  In short, the digital twin first accounts for the PLM’ish information leading up to delivery (requirements, simulations, designs, manufacturing processes, etc.).  Then each physical product gets its own instance of the digital twin to record what happens after delivery: as-built conditions, service modifications, and, here is the IoT connection, operational parameters.

The market may not yet be ready for this level of digitalization (a term I first heard here, at least in this context).  Many companies are still using shared drives and spreadsheets as their PLM tools of choice.  What we discovered is that if a company wants to have an IoT strategy, it has to understand what that entails.  Will the company be able to change its culture fast enough?  IIoT is a great feedback loop.  However, there has to be something to feed all that data back to.  The promise is there; I think people understand the benefits.

To answer the original question: I think PLM may be ready for it, as it matures and incorporates systems engineering, simulation-in-the-loop, and other enabling technologies.  However, I don’t think the market is ready for the marriage of IIoT and PLM.

In the FEA solver world, users encounter multiple numerical schemes for solving the formulated stiffness matrix of a problem. The most popular are the implicit and explicit solvers; in Abaqus terminology, these are the Standard and Explicit solvers, respectively. Each scheme has its own merits and demerits, and this blog post compares the two across several parameters.

For ease of understanding, I am avoiding the use of long and complicated mathematical equations in this post. 😉

        Implicit Scheme

From an application perspective, this scheme is primarily used for static problems that do not exhibit severe discontinuities. Let’s take the simplest example, a linear static problem, in which the physical situation is mathematically formulated as:

K·x = F


Here K is the stiffness matrix, x is the displacement vector and F is the load vector. The size of the matrix and vectors depends on the total number of degrees of freedom, which in turn depends on the mesh and the type of problem: for example, each node contributes three degrees of freedom in a 3D continuum problem, or six (three translations plus three rotations) in a 3D structural problem. The composition of K is governed by the material properties and the geometry. F includes the forces and moments applied at each node of the mesh. Now, to solve the above equation for x, the matrix K must be inverted (in practice, factorized rather than explicitly inverted). From the resulting displacement solution we compute other variables, such as strains, stresses, and reaction forces.
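The flow of a linear static solve can be illustrated with a toy two-spring system (a generic NumPy sketch, not Abaqus code; the stiffness and load values are made up):

```python
import numpy as np

# Toy illustration: two springs in series, assembled into K·x = F.
# k1 connects ground to node 1, k2 connects node 1 to node 2.
k1, k2 = 1000.0, 500.0           # spring stiffnesses (N/m), assumed values

K = np.array([[k1 + k2, -k2],
              [-k2,      k2]])   # assembled stiffness matrix
F = np.array([0.0, 10.0])        # 10 N applied at node 2

# Solvers factorize K rather than explicitly inverting it.
x = np.linalg.solve(K, F)        # displacement vector

# Post-processing: internal spring forces recovered from displacements.
f1 = k1 * x[0]
f2 = k2 * (x[1] - x[0])
print(x)                         # displacements at nodes 1 and 2
```

Both recovered spring forces equal the applied 10 N load, confirming equilibrium, which is exactly the kind of post-processing check an FEA solver performs.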


The implicit scheme is applicable to dynamic problems as well, where the equation of motion is:

M·ẍ + C·ẋ + K·x = F

Here M is the mass matrix, C is the damping matrix, and the rest are as before. This equation is defined in real time. Backward Euler time integration is used to discretize it; the state of the system at the end of an increment appears in the equations being solved for that increment, which is what makes the scheme implicit. K matrix inversion takes place in the dynamic scenario as well, because the objective is still to solve for x. The Abaqus Standard solver uses three different approaches to implicit dynamic problems: quasi-static, moderate dissipation, and transient fidelity. Each method is recommended for specific types of non-linear dynamic behavior; for example, the quasi-static method works well in problems with severe damping.
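To make the backward Euler idea concrete, here is a minimal sketch of implicit time integration for a single-DOF damped oscillator (illustrative only, not Abaqus’s actual implementation; all numbers are assumed):

```python
import numpy as np

# Minimal implicit (backward Euler) integration of M·a + C·v + K·x = F.
# The unknown end-of-increment velocity appears in the system being solved,
# which is what makes the scheme implicit and unconditionally stable.
M = np.array([[1.0]])      # mass
C = np.array([[0.2]])      # damping
K = np.array([[40.0]])     # stiffness
F = np.array([10.0])       # constant applied load

dt, n_steps = 0.01, 1000
x = np.zeros(1)            # displacement
v = np.zeros(1)            # velocity

# Effective system: (M/dt + C + dt·K) v_new = F - K·x_old + (M/dt)·v_old
# For a linear problem this matrix is constant, so one factorization
# serves every increment.
A = M / dt + C + dt * K
for _ in range(n_steps):
    rhs = F - K @ x + (M / dt) @ v
    v = np.linalg.solve(A, rhs)
    x = x + dt * v

print(x)   # settles toward the static solution F/K = 0.25
```

Because the scheme is unconditionally stable, the step size here is limited only by accuracy, not by stability, in contrast to explicit integration.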

Merits of this scheme

  • For linear problems, in which K is constant, the implicit scheme provides the solution in a single increment.
  • For non-linear problems, in which K is a function of x (making it necessary to solve the problem in multiple increments for the sake of accuracy), the size of each increment can be considerably large because the scheme is unconditionally stable.

For these reasons, the implicit scheme is preferred for simulating linear and non-linear static problems, and dynamic problems that are slow or moderate in speed with respect to time.
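For the non-linear case, each increment is itself solved iteratively. A Newton-Raphson sketch on a hypothetical hardening spring (illustrative, not Abaqus code) shows why the tangent stiffness is re-evaluated until the residual vanishes:

```python
# Newton-Raphson iteration for a nonlinear 1-DOF problem: a hardening
# spring with internal force f_int(x) = k·x + b·x³ (assumed values).
# Each iteration solves the residual R(x) = F_ext - f_int(x) = 0 using
# the tangent stiffness K_T = d(f_int)/dx.
k, b = 100.0, 50.0

def f_int(x):
    return k * x + b * x**3

def k_tangent(x):
    return k + 3.0 * b * x**2

F_ext = 120.0
x = 0.0
for _ in range(25):
    residual = F_ext - f_int(x)
    if abs(residual) < 1e-10:
        break                       # equilibrium reached
    x += residual / k_tangent(x)    # dx = K_T⁻¹ · R

print(x)   # converged displacement
```

Since K_T changes with x, the "stiffness matrix" is re-formed (and re-factorized) at every iteration, which is why non-linear implicit increments are far more expensive than linear ones.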

Demerits of this scheme […]

In a world of controlled, strict, non-flexible systems, people start to get creative.  For some, it’s the crushing weight of a massively customized ERP system that has somehow spread to every part of the organization; for others, it’s circumventing “inconvenient” safety devices. When things need to get done, sometimes we have to take matters into our own hands.  In the IT world, this is called “Shadow IT”: any app, software, or program that isn’t under the control of (or even known to) the IT department. Users download freeware, buy their own software, use cloud services, and so on. Even NASA has difficulty reining in its employees when it comes to using non-sanctioned software.

This behavior extends into the design and engineering office as well, perhaps more so than into other parts of an average organization. It’s in engineers’ nature to solve problems; it’s practically the key attribute of an engineering job description! I can understand why – engineers live and breathe efficiency, and being over-encumbered by poorly designed systems is not efficient at all.

Case in point: the Bill of Materials (BOM). How many systems are controlling it? ERP? The design tool? PDM?  Some home-grown custom system?  By and large…no.  Most work-in-process BOMs are kept in spreadsheets.  And why not?  Spreadsheet applications are easy to use and share, don’t require much training, and don’t require six levels of approval to implement. Spreadsheets can even be automated with macros, making BOMs configurable and intelligent. Eventually all items do end up in the ERP, though typically not until late in the product’s, project’s, or program’s lifecycle.
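To see the kind of logic those spreadsheet macros typically encode, here is a hypothetical multi-level quantity rollup in Python (the part names and structure are invented for illustration):

```python
# Hypothetical sketch of what a "smart" BOM macro computes: a multi-level
# quantity rollup. Each part maps to a list of (child, quantity) pairs.
bom = {
    "bike":  [("frame", 1), ("wheel", 2)],
    "wheel": [("rim", 1), ("spoke", 36)],
    "frame": [],
    "rim":   [],
    "spoke": [],
}

def rollup(part, qty=1, totals=None):
    """Accumulate the total quantity of every component under `part`."""
    totals = {} if totals is None else totals
    for child, n in bom[part]:
        totals[child] = totals.get(child, 0) + qty * n
        rollup(child, qty * n, totals)
    return totals

print(rollup("bike"))   # e.g. 2 wheels × 36 spokes = 72 spokes per bike
```

Useful, certainly, but notice that nothing here controls revisions, access, or approval, which is precisely the gap a PLM system fills.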

So, why not stay with the spreadsheets? What if someone edits those macros and an unknowing user doesn’t verify the end result? What if the file gets corrupted? How does the rest of the organization gain visibility into the latest changes? What ensures that the BOM is up to date with the design? Ultimately, the BOM should be controlled in a PLM system, much to the chagrin of clever engineers everywhere.  Here’s why.

Just as when the market moved from pencil drawings, french curves, vellums, and other manual drafting techniques to 2D CAD tools, and then from 2D design to 3D modeling, the answer is: change.  Yes, sketching a part is faster than creating the lines and circles in CAD, but CAD technology propagates updates caused by change much faster than a manual process can. The gains of going from 2D designs to 3D models are even more staggering.  “You mean one edit and it updates everywhere?  Even the drawing?”  Anecdotally, I had a customer say, “With 2D, whenever the shop called, I was worried about what we messed up.  Now, with 3D models, when the shop calls, I worry about what they messed up.”

Again, it’s about the rate of change, and the propagation and validation of that change. Spreadsheets cannot do that (unless you have some really wicked cool macros).

With our PLM Analytics Benchmark, we can help your company to assess the nature of your BOM needs, as well as the 16 pillars of PLM. Let us know if we can be of service!

In the simulation community, when it’s time to learn how software tools are applied in real-life product development, one of the best ways to do it is to approach other users. The regional user meetings organized by SIMULIA once a year have exactly that objective: bring together the user community. These meetings gather users to share their knowledge and experience in advancing methods and technology for finite element analysis, multiphysics, process automation, design optimization, and simulation management. There are also opportunities to present a white paper, listen to keynote speakers on the value simulation brings to virtual product development, get updates on new releases from the SIMULIA product management team, and contribute to the event’s success as a sponsor. As SIMULIA is not so much a product as a portfolio of products, the SIMULIA regional user meetings are often a conglomeration of various product-specific events: Abaqus, Tosca, Isight, fe-safe, Simpack, Simpoe, and the 3DEXPERIENCE platform.

Initially this event was called the SIMULIA Community Conference, held at a single location: the North American headquarters in Rhode Island. As its popularity grew, the size of the event grew as well, and there was a need to offer it in multiple locations to make it more accessible to regional users. The new series of events was named the SIMULIA regional user meetings, which are now held at different locations every year: Great Lakes, Houston, California, Toronto, and Sao Paulo (Brazil).

Just how big are these gatherings? We try never to miss them because of their size and value; the snapshot below, from the 2014 event held in Providence, RI, speaks for itself.

The 2017 Regional User Meetings are coming up!

For all the information regarding dates, venue, agenda and registrations, please click below.


ENOVIA PLM Essentials, on the 3DEXPERIENCE platform, is a package of essential PLM capabilities for mid-sized manufacturers using CATIA V5, SOLIDWORKS, and/or third-party CAD systems. Capabilities include CAD and document management, BOM management, change management, and project management, plus social collaboration; together they help increase revenues, improve product quality, shorten time to market, drive innovation, and achieve a competitive advantage.

Innovation is the only way to stay competitive and profitable – even to survive. When non-productive tasks are removed, engineering teams get time back to focus on innovation and can produce more alternatives from which to choose the best solution.

We are hosting a live seminar with Dassault Systèmes on September 7th, 2017 at our Novi, MI headquarters. We’ll be covering all of the essentials topics discussed above and showing the value you can get by adopting the 3DEXPERIENCE platform.

Register to join us for a half day of education, networking, and lunch! If you want a closer look, stay after and chat one-on-one with our team. A full agenda is available to peruse on the registration page as well.

Many of our Abaqus customers don’t know that Computational Fluid Dynamics (CFD) is not the only method of modeling fluids in Abaqus. There are many other possibilities, and the right approach depends on the physics of the problem. This blog post discusses the multiphysics methods of modeling fluids in Abaqus.

  • CFD method: This is the well-known, traditional method for fluids modeling. It is based on an Eulerian formulation, in which material flows through the mesh, and is accessed through the Abaqus/CFD solver. Application example: flow through exhaust systems.
  • CEL method: This coupled Eulerian-Lagrangian method is primarily used in problems involving unbounded fluids where visualization of the fluid’s free surface is required. It can also simulate interaction between multiple materials, whether fluids or solids. The method is accessible through the Abaqus/Explicit solver. Application example: fluid motion in a washing machine.
  • SPH method: This smoothed particle hydrodynamics approach is primarily used to model unbounded fluids that undergo severe deformation or disintegrate into individual particles. It uses a Lagrangian approach, in which the material moves with the nodes (particles), and is accessed through the Abaqus/Explicit solver. The method can be used for solids as well as fluids. Application example: bird strike on an aerostructure.
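The particle-based idea behind SPH can be sketched in a few lines of Python (a conceptual 1D density summation, not Abaqus code; the cubic-spline kernel and the numbers are textbook assumptions):

```python
import numpy as np

# Conceptual SPH sketch: each particle's density is a kernel-weighted sum
# over its neighbors -- the Lagrangian, mesh-free idea in its simplest form.
def kernel(r, h):
    """Normalized 1D cubic-spline smoothing kernel with support radius 2h."""
    q = r / h
    if q < 1.0:
        w = 1.0 - 1.5 * q**2 + 0.75 * q**3
    elif q < 2.0:
        w = 0.25 * (2.0 - q)**3
    else:
        w = 0.0
    return w * (2.0 / (3.0 * h))     # 1D normalization factor

positions = np.linspace(0.0, 1.0, 11)   # 11 particles on a unit line
mass = 0.1                              # each particle carries equal mass
h = 0.1                                 # smoothing length = particle spacing

# Density at particle i: rho_i = sum_j m_j * W(|x_i - x_j|, h)
density = np.array([
    sum(mass * kernel(abs(xi - xj), h) for xj in positions)
    for xi in positions
])
print(density)   # ~1.0 in the interior, lower at the free ends
```

The lower density at the ends is the free-surface effect: particles there have fewer neighbors, which is also why SPH handles disintegrating material so naturally.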

We can compare these three methods across multiple parameters, such as materials, contact, and computation speed, to understand their applications and limitations:

  • Material considerations:

The SPH method is the most versatile in terms of material support: it handles fluids, isotropic solids, and anisotropic solids.

CFD is the only technique that can model fluid turbulence.

CFD is the only technique that can model porous media.

CFD and CEL allow material to flow through the mesh (Eulerian).

  • Contact considerations:


In today’s post, I would like to focus on Functional Modeling.

Plastic Part

I’ve always wondered why this workbench never really caught on. Speaking purely from a licensing standpoint, the FM1 trigram comes with the MCE add-on, which most people who have PLM Express have already added to their CAC configuration (CAC+MCE).


FM1 gets you the Functional Modeling Part Workbench.

Functional Modeling Part Workbench

First, let’s talk about what it was created for: plastic parts, or more generally parts with draft; it can also be used for core-cavity type parts like castings. This workbench is unique in that you do not necessarily model in a particular sequence, as you would in the Part Design workbench. Modeling in the Part Design workbench is what we would call traditional feature modeling: create a sketch, make a pad, add some dress-up features like drafts and fillets, then shell it out, and so on.

Feature Based Modeling

There is nothing at all wrong with modeling this way – in fact, it is how most of this work is done today! Now let’s look at what we call Functional modeling, which starts from a shape and incorporates a behavior for a specific requirement. […]

Product development companies need to manage a wide variety of documents in different formats and types as they design, manufacture, and support their products. Gone are the days when paper documents ran businesses. Today everything is digital, but very often the digital documents related to products and product development are created in siloed environments, disconnected from product development processes. Document authors often recreate or re-enter product development information into their documents.

If the document authors don’t have visibility into the latest product changes, documents become out of sync with product updates. This impacts critical business processes due to inaccuracies or lack of current data. For organizations working globally, another challenge is the high cost and time involved in building complex documents that have multiple language/regional and regulatory requirements.

Teamcenter addresses this challenge by enabling documents that relate to and support product development to be stored alongside product data and processes. When documents are managed in the context of product data related to parts, or to other documents, companies have a single version control, access control and process control system for the entire enterprise, including product data and documents.

Source material from product data can be accessed and used to create documents like parts catalogs, work instructions, service material, specifications for suppliers, trade studies, or even regulatory filings. The documents can then be delivered  as appropriate to the end user in the required format, whether as a PDF or HTML web page, an interactive web tool, or exchanged with customers or suppliers using an industry standard.

The Teamcenter document management solution is focused on improving the document quality while streamlining the process of document creation and delivery. One of the central themes to this is “Transparent PLM.”

In a transparent PLM approach, users continue to do all their document work in their existing document authoring tools, like the Microsoft Office product suite.  They can also perform PLM activities – including review, approval, and version or effectivity tracking – directly from the same Office products.   With users continuing to work in document tools they are already proficient with, they become more productive, and the learning curve of a new PLM tool is eliminated. This enables easy user adoption of the solution without any formal training requirements. […]

There is an excellent story in leadership consulting lore. I’m not sure how true it is, but the lessons derived from it are incredibly valuable.

There was once a detachment of Hungarian soldiers that struck out on a reconnaissance mission from their platoon in the Alps. While they were out, there was a massive snowstorm and the soldiers lost their way – returning was impossible.  The team was worried; they were not prepared for an extended stay out in these harsh conditions, and even if they had been, how would they get back with no knowledge of their location? They had all but given up hope when one soldier, while rummaging through his uniform, found a map. He showed it to the group and a newfound sense of hope came over them. They rallied together, found shelter, and waited out the storm.

After a couple of days, the blizzard finally let up. Wearily, the soldiers set about returning to their platoon. Using the map, they identified various features of the land, and made their way back. Their commander was elated to see them alive and well. When he asked the team how they did it, the soldier showed the commander the map that had not only guided them back, but had also given them the hope to persevere.  Confused, the commander asked this soldier, “How on earth did you find your way using a map of the Pyrenees?”

This story teaches us many things; here are two:

  • Fear and anxiety can lead people to inaction, even to their own detriment (and the effect usually intensifies in groups)
  • Even with the wrong strategy or plan, the chances of success are higher than with no plan at all

The second point has many applications in the business world.  The one I think of most, in terms of our manufacturing customers, is their shop floors.  Often manufacturers, especially small and medium-sized ones, don’t have a chance to get deep into process planning.  Stations are haphazardly placed, too many or too few activities are scheduled at stations, new machinery is placed wherever it fits, and so on.  All of this causes bottlenecks and slows getting things out the door.  As we all know, time is money – especially in manufacturing, where every lost minute, hour, or day translates into lost revenue.

Tata Technologies has an amazing team of technical experts and works with many solution providers that can help manufacturers find their own map. One of the maturity benchmarks we offer is for the “Digital Factory”; contact us to schedule yours.


This post was originally written in January of 2017.

With all the buzz about Additive Manufacturing, or 3D Printing, in the manufacturing world today, there is a lot of mystery and confusion surrounding common practices and techniques. This week’s blog post addresses a common type of 3D printing known as Electron Beam Freeform Fabrication (EBF³).

What is Electron Beam Freeform Fabrication?

It is actually part of a broader category commonly referred to as filament extrusion techniques, all of which utilize a thin filament or wire of material. The material, typically a thermoplastic polymer, is forced through a heating element and extruded in a 2D cross-section onto a platform. The platform is lowered and the process is repeated until the part is complete. In most commercial machines, and in higher-end consumer-grade machines, the build area is typically kept at an elevated temperature to prevent part defects. The most common technology of this type, and the first to be developed, is Fused Deposition Modeling.
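The layer-by-layer loop shared by all filament extrusion techniques can be sketched as follows (the part and layer heights are illustrative, not from any particular machine):

```python
# Conceptual sketch of the filament-extrusion build loop: slice the part
# height into layers, extrude each 2D cross-section, lower the platform.
part_height_mm = 30.0     # assumed part height
layer_height_mm = 0.2     # assumed layer height

num_layers = int(round(part_height_mm / layer_height_mm))
for layer in range(num_layers):
    z = layer * layer_height_mm
    # 1. compute the 2D cross-section of the model at height z
    # 2. generate extruder toolpaths (perimeters, then infill) for the slice
    # 3. deposit material, then lower the platform by one layer height
print(num_layers)   # 150 layers for this part
```

The layer height drives the classic trade-off: halving it roughly doubles build time while improving surface finish.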

The Fused Deposition Modeling technique was developed by S. Scott Crump, co-founder of Stratasys, Ltd., in the late 1980s and patented in 1989. The patent for FDM expired in the early 2000s, which helped give rise to the Maker movement by allowing other companies to commercialize the technology.

Electron Beam Freeform Fabrication, or EBF³, is one of the newest forms of rapid prototyping. The technique uses a focused electron beam and a metal wire or filament: the wire is fed into the electron beam to create a molten pool of metal. The material solidifies almost instantly once the electron beam passes, and is able to support itself, meaning support structures generally aren’t required. The entire process must be executed under a high vacuum.

Pioneered by NASA Langley Research Center, this process is capable of producing incredibly accurate parts at full density (a result other additive manufacturing techniques struggle to achieve, or achieve only with secondary operations). It is also one of the only techniques that can be performed successfully in zero-gravity environments.

What Are the Advantages of this Process? […]

© Tata Technologies 2009-2015. All rights reserved.