Category "Simulation"

The Dassault Systèmes SIMULIA portfolio releases new versions of its software products every year, and this year is no different. The first release of Abaqus 2017 is now available for download at the media download portal. SIMULIA has developed and broadcast 2017 release webinars to make users aware of the new features, but those webinars are long recordings, ranging from one to two hours each, which can be daunting. This blog post provides a brief overview of the Abaqus/Standard and Abaqus/Explicit solver updates in the 2017 release. A more detailed explanation of any mentioned update, or answers to further questions, can be obtained by listening to the webinar recordings at the SIMULIA 3DEXPERIENCE user community portal, leaving a comment on this post, or contacting us.

Updates in Abaqus Standard

Abaqus/Standard 2017 has been substantially improved with respect to contact formulations. The key improvements to the various contact functionalities are highlighted below.

  • Edge-to-surface contact has been enhanced to support beams as the master surface definition. This new approach captures the twisting of beams during frictional contact.
  • Cohesive behavior in general contact.

General contact has always been useful in situations where it is either cumbersome to visualize and define a large number of contact pairs, even with the contact wizard, or impossible to predict contact interactions from the initial configuration. General contact now supports cohesive behavior, making it possible to define contact in situations such as the one shown in the figure below.

 

Cohesive contact does not constrain rotational degrees of freedom. These DOFs should be constrained separately to avoid pivot ratio errors.

There have been a few other changes to cohesive contact interactions. In the 2016 release, only first-time cohesive contact was allowed by default, i.e. either a cohesive behavior that was closed at initial contact, or an open initial contact that could convert to a closed cohesive contact only once. In the 2017 release, only a contact that is closed initially maintains cohesive behavior under the default settings; an open contact cannot convert to cohesive contact later. However, it is possible to change the default settings.
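For readers who prefer scripting over the CAE dialogs, the idea can be sketched with the Abaqus Scripting Interface: create a contact interaction property that carries the cohesive behavior, then assign it within the general contact definition. This is only a sketch; the model and property names are placeholders, and the exact CohesiveBehavior arguments (including the option that controls whether only initially closed contacts may bond) should be verified against the Scripting Reference for your release.

```python
# Sketch only: names are placeholders; verify arguments against the
# Abaqus Scripting Reference for your release.
from abaqus import mdb
from abaqusConstants import ON

model = mdb.models['Model-1']

# Interaction property that carries the surface-based cohesive behavior
cohesive_prop = model.ContactProperty('CohesiveProp')
cohesive_prop.CohesiveBehavior(defaultPenalties=ON)  # default traction-separation stiffnesses

# 'CohesiveProp' is then assigned to the relevant surface pairs inside the
# general contact definition (contact property assignments in the Interaction module).
```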


 

  • Linear complementarity problem

A new step feature has been introduced to remove some limitations of the linear perturbation step. In earlier releases, it was not possible for contact defined in a perturbation step to change its status from open to closed or vice versa. In the 2017 release, an LCP-type technique has been introduced in the perturbation step to define frictionless, small-sliding contact that can change its contact status. No other forms of nonlinearity are supported in perturbation steps. LCP is available only for static problems; dynamic steps are not supported.


Updates in Abaqus XFEM (crack modeling) […]

The Dassault Systèmes SIMULIA portfolio releases new versions of its software products every year, and this year is no different. The first release of Abaqus 2017 is now available for download at the media download portal. In this blog post, I provide a brief overview of the updates in Abaqus CAE 2017. A more detailed explanation of any mentioned update, or answers to further questions, can be obtained by listening to the webinar recordings at the SIMULIA 3DEXPERIENCE user community portal, leaving a comment on this post, or contacting us.

  • Consistency check for missing sections

Abaqus CAE users would probably agree that forgetting a section assignment happens quite often, even though parts with defined section assignments are displayed in a different color. In previous releases, this check was not included in the data check run, so the error could not be detected unless a full analysis was executed. In the 2017 release, regions with missing sections are identified during a data check run, saving time by eliminating runs that end in fatal errors.
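The same check can also be launched by scripting. The snippet below is a sketch with placeholder job and model names; it submits a data-check-only job from the Abaqus/CAE kernel, equivalent to running "abaqus job=<name> datacheck" at the command line.

```python
# Run only the data check pass (no full solve); names are placeholders.
from abaqus import mdb

job = mdb.Job(name='datacheck_only', model='Model-1')
job.submit(datacheckJob=True)   # missing section assignments are reported in this pass
job.waitForCompletion()
```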


 

  • New set and surface queries in query toolset

Sets and surfaces can be created at both the part and the assembly level. In earlier releases, it was not possible to see the contents of a set or surface as text, though they could be visualized in the viewport. In the 2017 release, the query toolset includes set and surface definition options. For sets, information about geometry, nodes, and elements can be obtained, including label, type, connectivity, and association with a part or instance, whichever is applicable. For surfaces, the name, type, and association with instances, constraints, or interactions can be obtained.
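The same information has long been reachable through scripting, which can be handy for very large sets. The sketch below uses placeholder model, part, and set names and simply prints the labels, coordinates, types, and connectivity of a set's contents from the Abaqus/CAE kernel.

```python
# Print the contents of a part-level set; names are placeholders.
from abaqus import mdb

part = mdb.models['Model-1'].parts['Part-1']
my_set = part.sets['Set-1']

for node in my_set.nodes:
    print('node %d at %s' % (node.label, str(node.coordinates)))

for elem in my_set.elements:
    print('element %d, type %s, connectivity %s'
          % (elem.label, elem.type, str(elem.connectivity)))
```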


 

  • Geometry face normal query

In the 2017 release, it is possible to visualize the normal of a face or surface by picking it in the viewport. For planar faces, the normal is displayed instantly. For curved faces, CAE prompts the user to pick a point location on the face using one of several options.
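A scripted equivalent exists as well, via the Face object's getNormal() method. The snippet below is a sketch with placeholder names; for a curved face the normal is evaluated at a chosen point on the face.

```python
# Query a geometry face normal from the kernel; names are placeholders.
from abaqus import mdb

part = mdb.models['Model-1'].parts['Part-1']
face = part.faces[0]

normal = face.getNormal()    # planar face: the normal is unique
# For a curved face, evaluate the normal at a point on the face, e.g.:
# normal = face.getNormal(point=face.pointOn[0])
print('face normal: %s' % str(normal))
```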


[…]

In a race, the quickest runner can never overtake the slowest, since the pursuer must first reach the point whence the pursued started, so that the slower must always hold a lead.

— Aristotle, Physics VI:9, 239b15

This paradox, first posed by Zeno and later retold by Aristotle, shows how taking a hypothesis to its logical extreme can lead to an absurd conclusion. To look at it another way, consider this joke:

A mathematician and an engineer are trapped in a burning room.

The mathematician says “We’re doomed! First we have to cover half the distance between where we are and the door, then half the distance that remains, then half of that distance, and so on. The series is infinite.  There’ll always be some finite distance between us and the door.”

The engineer starts to run and says “Well, I figure I can get close enough for all practical purposes.”
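For completeness, the series the mathematician is worried about does converge: half the distance, plus half of what remains, and so on, adds up to exactly the original distance d, since d/2 + d/4 + d/8 + … = d. The infinitely many steps cover only a finite distance; the engineer simply decides that finitely many of them are good enough.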

The principle here, as it relates to simulation methods like FEA, is that every incremental step taken in the simulation process gets us closer to our ultimate goal of understanding the exact behavior of the model under a given set of circumstances. However, there is a point of diminishing returns beyond which a physical prototype must be built. This evolution of simulating our designs has saved a lot of money for manufacturers who, in the past, would have had to build numerous iterative physical prototypes. This evolution of FEA reminds me of…

The uncanny valley is the idea that as a human representation (robot, wax figure, animation, 3D model, etc.) increases in human likeness, people feel more affinity towards it, but only up to a certain point. Once this threshold is crossed, our affinity drops off to the point of revulsion, as in the case of zombies or the “intermediate human-likeness” prosthetic hands. However, as the realism continues to increase, the affinity in turn starts to rise again.

Personally, I find it fascinating that a trend moving through time can abruptly change direction and then, for some strange reason, revert to its original direction. Why does this happen? There are myriad speculations as to why on the Wikipedia page, which I encourage the reader to peruse at leisure.

But to tie this back to FEA, think of the beginning of the uncanny valley curve as the start of computer-aided design simulation, where the horizontal axis is time and the vertical axis is accuracy. I posit that over time, as simulation software has improved, the accuracy of our simulations has also increased. As time has gone on, ease of use has also improved, allowing non-doctorate holders to utilize simulation as part of their design process.

And this is where we see the uncanny valley: as good as the software is, there comes a point, if you are doing specialized, intricate, or non-standard analysis, where the accuracy of the software falters. This tells us that there will still be a need for those PhDs, and once they get involved in the design and start using the software, we see the accuracy climb sharply again.

If you need help getting to the door, or navigating the valley, talk to us about our Simulation benchmark process. Leave a comment or click to contact us.

 

In the years to come, fuel efficiency and reduced emissions will be key factors in determining success within the transportation & mobility industry. Fuel economy is often directly associated with the overall weight of the vehicle. Composite materials have been widely used in the aerospace industry for many years to achieve the objectives of light weight and better performance at the same time.

The transportation & mobility industry has been following the same trend, and it is not uncommon to see composites applied in this industry sector nowadays; however, unlike in the aerospace industry, wide replacement of metals with composites is not feasible in the automotive industry. Hence, apart from material replacement, other novel methods of designing and manufacturing lightweight structures without compromising performance will find greater utilization in this segment. In this blog post, I will discuss the application of Tosca, a finite element based optimization technology.

Lightweight design optimization using a virtual product development approach is a two-step process: concept design followed by design improvement.

Design concept: Product development costs are largely determined in the early concept phase. The automatic generation of optimized design proposals reduces the number of product development cycles and the number of physical prototypes; quality is increased and development costs are significantly reduced. All you need is a definition of the maximum allowed design space; Tosca helps you find the lightest design that fits within it and satisfies all system requirements. The technology associated with the concept design phase is called topology optimization; it considers all design variables and functional constraints in the optimization cycle while pursuing the minimum-weight objective function. The technique is iterative and typically converges to an optimal design.

HOW IT WORKS

The user starts with an initial design by defining the design space, design responses, and objective function. The design space is the region from which material removal is allowed in incremental steps, and the objective function is often the overall weight of the component, which is to be minimized. With each incremental removal of material, the performance of the component changes, so each Tosca increment is followed by a finite element analysis that checks the current performance against the target performance. If the target performance criteria are satisfied, the updated design increment is accepted and Tosca proceeds to the next increment. This process of incremental material removal continues until the objective function is satisfied or no further design improvement is feasible. The image below depicts the complete CAD-to-CAD process flow in Tosca. The intermediate steps are Tosca pre-processing, co-simulation between Tosca and a finite element code, and Tosca post-processing.

Tosca workflow
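To make the loop concrete, here is a deliberately simplified sketch of the iterate-remove-check cycle described above. It is not Tosca's algorithm or API: the two helper functions are stand-ins for the real finite element run and for Tosca's sensitivity-based material removal, and the numbers are arbitrary.

```python
# Schematic only: stand-in functions and toy numbers, not Tosca's actual algorithm or API.

def evaluate_performance(design):
    """Stand-in for the finite element analysis; returns a toy performance metric."""
    return 10.0 * sum(design)            # more material -> better performance here

def remove_material(design, fraction=0.05):
    """Stand-in for Tosca's incremental material removal."""
    trimmed = list(design)
    n_remove = max(1, int(len(trimmed) * fraction))
    removed = 0
    for i, filled in enumerate(trimmed):
        if filled and removed < n_remove:
            trimmed[i] = 0               # "delete" one element of the design space
            removed += 1
    return trimmed

design = [1] * 100                       # design space: 100 elements, all filled
target = 400.0                           # required performance (functional constraint)

while True:
    candidate = remove_material(design)
    if evaluate_performance(candidate) < target:
        break                            # further removal would violate the target: stop
    design = candidate                   # accept the lighter design and iterate

print('elements kept: %d of 100' % sum(design))
```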

During the material removal process, Tosca can be asked to produce an optimized design that is feasible not only from a design perspective but also from a manufacturing perspective. For example, Tosca may be asked to recommend only those design variations that can be manufactured by casting or stamping. This is done by defining one or more of the manufacturing constraints available in the Tosca constraint library.

manufacturing constraints

While topology optimization is applicable only to solid structures, that does not mean Tosca cannot optimize sheet metal parts. The sizing optimization module of Tosca allows users to define the thicknesses of sheet metal parts as design variables with lower and upper bounds. […]

Predictive Engineering Analytics is a must in current product design and is required to integrate all the multidisciplinary inputs present in today's complicated products. In the words of Siemens:

“Predictive Engineering Analytics (PEA) is the application of multi-discipline simulation and test, combined with intelligent reporting and data analytics, to develop digital twins that can predict the behavior of products across all performance attributes, throughout the product lifecycle.”

In the above quote, the concept of a digital twin is important; this is the goal of having a complete digital representation of a physical product throughout its design, manufacturing and in-service lifecycle. Such a digital twin can accurately predict all behaviors of the physical product.

There are five key ways that Simcenter(TM) helps achieve a digital twin.

  1. Simcenter(TM) has an integrated Engineering Desktop environment, which allows all pre- and post-processing operations to be carried out. This environment has best-in-class geometry editing tools, comprehensive meshing, and an ability to associate the analysis model to design data.
  2. The environment is completely extendable and can be scaled from simple to complex problems. The benefits include a common user interface and the capability to create automated routines for common simulation problems.
  3. Simcenter(TM) can be linked into other integrated products and engineering systems. This enables simulation data management and allows links to be established to test data, 1D simulations, and 3D CAD. Engineers now have confidence that the behaviors predicted by the digital twin correlate with real life.
  4. The solution produces a common environment across all engineering departments. This allows knowledge to be captured, automated, and then used by a broader team. Specific solutions and best practices become entrenched in the organization, allowing for more consistent design processes. Training requirements are also reduced.
  5. Simcenter(TM) leverages the extensive industry knowledge and capabilities of the Siemens PLM broader portfolio.

If we look at the specific functions that Simcenter(TM) can cover, here is a quote and graphic from a Siemens presentation:


Another unique feature of the Simcenter(TM) solution is its open platform capability. The solution can be used as the primary pre- and post-processor for the Siemens PLM solvers, NX Nastran and LMS Samcef, or for popular third-party solvers such as Abaqus, ANSYS, LS-DYNA, and MSC Nastran. This is illustrated in another graphic from a Siemens presentation: […]

How many times has the first design iteration submitted to FEA modeling passed the design criteria?

The answer is close to zero, but even if it does happen by a stroke of fortune, the design is not the optimal design: although the design requirements are met and validated by FEA, there is always scope for improvement, either in cost or in performance. In general, it is not unusual to need 15 to 20 iterations to reach the optimal design.

An analyst knows the pain of creating a detailed finite element simulation model. Most of the steps involved, such as geometry cleanup and meshing, are very time-consuming, and they are primarily driven by geometry. Let's look at the workflow in more detail:

An analyst in the automotive industry often performs finite element modeling in HyperMesh, stress analysis in Abaqus, optimization in OptiStruct, and durability analysis in fe-safe or nCode. An analyst in the aerospace industry often performs composites design in CATIA, finite element modeling in Abaqus CAE, stress analysis in Abaqus or Nastran, and durability analysis in fe-safe. Analysts in other industries have their own suites of FEA tools. The entire process requires data to flow from one simulation code to another: output from one code serves as input to the next. Quite often this transfer is done manually by the analyst.

This means that in a situation where the optimal design is obtained in 20 iterations, as mentioned above, the analyst has to clean the geometry 20 times, create FE meshes manually 20 times, and transfer the simulation data from one code to another 20 times. By the time these design iterations are over, the analyst's face and computer look somewhat like this:

Let analysts remain analysts, and let a simulation robot do the rest!

The traditional job of a finite element analyst is to build robust, high-fidelity simulation models that give correct results under real-life loads. The analyst is not an FE robot who can perform repetitive tasks with ease. In situations like the one mentioned above, it makes perfect sense to have the FE analyst create a robust FE model only once per FE code involved, and then introduce a simulation robot that captures the underlying steps and workflow, creates a script, and executes that script as many times as needed. This simulation robot is called Isight. […]
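To illustrate the kind of repetitive chain Isight takes over, here is a bare-bones sketch of scripting such a loop by hand. The three command templates are deliberately just echo placeholders, to be replaced with the actual batch invocations of the meshing, stress, and durability tools used at your site; Isight adds graphical workflow capture, DOE and optimization drivers, and data management on top of this.

```python
# Placeholder workflow loop: the commands below are echo stubs, not real tool syntax.
import subprocess

MESH_CMD = 'echo meshing design_{i:02d}'       # replace with your pre-processor batch call
SOLVE_CMD = 'echo solving stress_{i:02d}'      # replace with your solver call
FATIGUE_CMD = 'echo fatigue check {i:02d}'     # replace with your durability tool call

for i in range(1, 21):                         # e.g. the 20 design iterations above
    for template in (MESH_CMD, SOLVE_CMD, FATIGUE_CMD):
        cmd = template.format(i=i)
        print('running: %s' % cmd)
        subprocess.check_call(cmd, shell=True) # abort the chain if any step fails
```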

I often hear customers who design mechanical components say something like “I have assembly design constraints and don't think I need Kinematics.” The truth is, you may not need it, if you design items that do not move. Kinematics is the study of motion. With the standard CATIA V5 Assembly Design constraints, you are limited to a single movement based on a given set of constraints, either by holding down the right mouse button when using the compass to move an item (holding down the right mouse button respects the constraints already applied) or by checking the With respect to constraints option in the Manipulate dialog.

Below is an example of what can be done with simple assembly constraints and the manipulator and where its limitations are.

 

What if you needed more than one movement to happen at the same time? That is where Kinematics helps. With CATIA V5 Kinematics, you have many, many options for setting up motion. Each grouping of movements is called a mechanism, and within the mechanism you define joints. CATIA V5 offers every kind of joint I can think of, and I have yet to run across anything else I would need.

joints

The joints are groupings of your constraints that can then be controlled by commands; they are very simple to set up. The freedom to have anything move at any given time! In fact, if you already have constraints defined in your assembly, there is a slick converter option to reuse the work you have already done and turn your constraints into joints. Below is a simple mechanism with multiple joints defined, played back to show how the toy excavator product works.

 

Although this is a simple mechanism, the Kinematics package can do so much more, like analyzing the travel of a particular joint and checking whether its limits have been reached. If you combine the Kinematics package with the CATIA V5 Space Analysis license, you gain the ability to check for clash and clearance between moving parts, which is exactly what most customers need to do! Add in a CATIA V5 DMU Navigator license and you can animate your sections. How cool is that?!

Bottom line: if you need motion and need to know how your motion affects other parts in your assembly, contact us and we will get you moving in the right direction!

 

 

There is a phrase in the finite element analyst user community. Those who have been in the industry for a while must have heard it at some point in their career.

     GARBAGE IN….GARBAGE OUT

It means that if the data fed into the input deck is not correct or appropriate, the solver is very likely to give incorrect results, if it does not fail with errors outright. Many of us believe that getting some sort of result is better than getting fatal errors, which is not correct. Fatal errors give clear diagnostic messages that allow the user to correct the input deck. Erroneous results, on the other hand, can make the user feel that the simulation has been successful even though the results may be far from reality. Such situations are hard to detect and correct, as the underlying cause is not clearly visible.

One such situation arises when the user inadvertently chooses an element type that is not capable of capturing the actual physical behavior of the part or assembly it is associated with. The incompatibility may lie in the element material, element topology, element dimension, or the type of output associated with the element. The objective of this post is to highlight the capabilities and limitations of some lesser-known element types available in the Abaqus element library and to promote their proper usage.

Planar elements

These elements are further classified as either plane stress (CPS) or plane strain (CPE) elements. Plane stress elements are used to model thin structures, such as a composite plate. These elements must be defined in, and can deform only in, the X-Y plane. For these elements:

σ_zz = τ_xz = τ_yz = 0


Plane strain elements are used to model thick structures, such as rubber gaskets. These elements must also be defined in, and can deform only in, the X-Y plane. For these elements:

ε_zz = γ_xz = γ_yz = 0

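The distinction shows up directly in the element code you assign. The Abaqus/CAE scripting sketch below uses placeholder model and part names and a simplified region selection; it assigns 4-node plane stress elements (CPS4) to a thin planar part, with CPE4 shown as the plane strain counterpart.

```python
# Sketch: choosing plane stress vs. plane strain element types in Abaqus/CAE scripting.
# Model/part names and the region selection are placeholders.
from abaqus import mdb
from abaqusConstants import STANDARD, CPS4, CPE4
import mesh

part = mdb.models['Model-1'].parts['ThinPlate']

plane_stress = mesh.ElemType(elemCode=CPS4, elemLibrary=STANDARD)  # thin parts, e.g. a plate
plane_strain = mesh.ElemType(elemCode=CPE4, elemLibrary=STANDARD)  # thick parts, e.g. a gasket

# A thin plate gets the plane stress type; a thick gasket section would get plane_strain instead.
part.setElementType(regions=(part.faces,), elemTypes=(plane_stress,))
```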

Generalized plane strain elements

[…]

Our SIMULIA user community has been using the conventional analysis and portfolio tokens for a while now. These tokens are primarily used to access the Abaqus CAE pre-processor, the Abaqus solvers, and the Abaqus viewer. The analysis configuration offers Abaqus solver licenses in the form of tokens, with Abaqus CAE and the Abaqus viewer as interactive seats. The portfolio configuration offers all three components of Abaqus, i.e. the solvers, Abaqus CAE, and the Abaqus viewer, as tokens.

IS SIMULIA = ABAQUS ONLY?

The new equation has been EXTENDED:

SIMULIA = ABAQUS + ISIGHT + TOSCA + FE-SAFE

The overall simulation offerings from Dassault Systèmes go well beyond Abaqus finite element simulations. The functionalities now include process automation, parametric optimization, topology optimization, fatigue estimation, and many more. Starting with Abaqus release 6.13-2, all of these additional capabilities are included in a single licensing scheme called extended tokens. Here is an overview of these additional SIMULIA products.

ISIGHT

Isight is an open desktop solution for creating flexible simulation process flows, consisting of a variety of applications, to automate the exploration of design alternatives, identify optimal performance parameters, and integrate added-value systems. The simulation process flows created in Isight can include components for third-party simulation codes such as Ansys, LS-DYNA, Nastran, and Mathcad, as well as general-purpose components such as Matlab, Excel, a calculator, and many more. It offers advanced parametric optimization, design of experiments, and Six Sigma techniques. Moreover, the vast amount of simulation output data generated by such techniques can be managed effectively using Isight's post-processing runtime gateways. It is rightly called a simulation robot.


 

TOSCA

Tosca is a general-purpose optimization solution for designing high-performance lightweight structures. As fuel economy continues to be one of the most important design factors in the transportation and aviation industries, designing lightweight components and assemblies will remain a top priority, and Tosca can really help to achieve those objectives. […]

As an FEA analyst, you are likely losing too much of your time to CAD repair.

If you are an experienced FEA analyst, you have probably come across the following types of situations while meshing your models:

“I create 3D geometries in CAD uniting together several surfaces so that the CAD modeler itself sees one unique surface; however, whenever I export it as a .sat, .stp or even binary file for Parasolid and then import it into the FEA pre-processor, I again see all those surfaces that are not supposed to be there.”

“For some parts I am extruding surfaces to solids, and for some parts I am building solids out of intersecting surfaces. All in all, it is a kind of a box structure with a hole on one side. I started importing it to GUI part by part, and as soon as I have top and bottom plate and two sides, the meshing fails. How did you exactly resolve this meshing problem?”

The FEA user community knows that most of the user interfaces available for finite element analysis are good for FE modeling only; they are not expert CAD modelers. It often happens that the CAD model is not free from defects from a meshing perspective. The most common problems are duplicate edges, gaps, sliver surfaces, unnecessary patches, etc. The problem is often more severe if the CAD model is first translated to a neutral format such as .sat, .iges, or .step before being imported into the FEA pre-processor; defects are generated during the translation. In many other cases, repairs made in the CAD model are not propagated to the FEA modeler. The only option left is to repair the geometry in the FEA model itself, but the repair tools required often don't exist in these user interfaces.

One-click model transfer from CAD to FEA without any neutral file format

For Abaqus users, there is great news: the Abaqus CAE pre-processor now has associative interfaces for CATIA, ProE and SOLIDWORKS.

The CATIA V5 Associative Interface allows you to transfer CATIA V5 Parts and Products into Abaqus/CAE using associative import. Materials and publications assigned to the CATIA V5 model are also transferred to the Abaqus/CAE model as material and set definitions respectively. In addition to associative import, the CATIA V5 Associative Interface allows you to directly import the geometry of CATIA V5 models in .CATPart and .CATProduct format into Abaqus/CAE without any intermediate neutral files. The following options are available with CATIA V5 associative interface: […]
