
Composites have always had a well-defined place in the aerospace industry because of their properties: light weight, which makes the overall design lighter, and toughness, which lets the structure bear aerodynamic and structural loads. Today, from aircraft fairings to train noses, boat hulls, and wind turbines, composites offer dramatic opportunities to meet increasingly cost-driven market requirements and environmental concerns. However, modeling composites in a seamless collaborative environment has always been a challenge, because the multiple aspects of composites modeling (design, simulation, and manufacturing) are difficult to bring together on a single platform.

CATIA composites workbench now offers a solution to address various aspects of composites modeling in a unified manner. The objective of this blog post is to provide information on composites workbench capabilities with respect to design, simulation, and manufacturability of composites.

DESIGN IN ANALYSIS CONTEXT

There are different ways to start the preliminary design of a composite part, but zone-based design is ideal for capturing analysis constraints and predicting the behavior of the part inside the design environment by importing thickness laws, which are calculated as a result of FEA. The composites part design workbench in CATIA provides easy-to-use, dedicated zone creation and modification features. Zone-based modeling contributes to significant time savings through the ability to perform concurrent engineering with mating parts. The image below shows a wing panel with a grid created from ribs and spars in assembly context, with the thickness law for each cell mapped onto the grid from a spreadsheet.

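The spreadsheet-to-grid mapping can be pictured as a simple lookup from cell identifier to thickness. A minimal sketch of that idea (the cell names and thickness values below are invented for illustration, not taken from any CATIA format):

```python
import csv
import io

# Hypothetical spreadsheet export: one thickness law value per grid cell.
sheet = "cell,thickness\nC1,2.4\nC2,3.1\nC3,2.4\n"

# Build the cell -> thickness mapping that would be applied to the grid.
thickness_law = {row["cell"]: float(row["thickness"])
                 for row in csv.DictReader(io.StringIO(sheet))}
print(thickness_law["C2"])  # 3.1
```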

Once the grid information is ready, the composites workbench provides highly productive automatic ply generation from zones, with automatic management of ply staggering and stacking rules. The ability to transition quickly and automatically from zones to plies while keeping full associativity allows the designer to focus on the design intent and dramatically reduces the number of geometrical tasks required to design the part.


To further check the viability of a design from a structural strength perspective, it is possible to perform FEA within the CATIA environment using the ELFINI solver of CATIA Analysis. Full associativity with the composites workbench is maintained, and true fiber angles are taken into account. To address the non-linear aspects of FEA, it is possible to export the ply data as layup files to Abaqus/CAE using the composites fiber modeler plug-in. If design modifications are needed, any ply or sequence can be edited in the composites workbench and the modified layup file instantly exported to the simulation workbench or Abaqus/CAE for validation. Thus designers and analysts can collaborate throughout the composites development process, saving time, improving product quality, and preventing costly errors. […]

The Dassault Systèmes SIMULIA portfolio releases new versions of its software products every year, and this year is no different. The first release of Abaqus 2017 is now available for download at the media download portal. SIMULIA has developed and broadcast 2017 release webinars to make users aware of new features available in the 2017 release, but those webinars are long recordings ranging from one to two hours each, which can be daunting. This blog post will provide a brief highlight of materials and explicit updates in Abaqus solver 2017. A more detailed explanation of any mentioned update, or answers to further questions, can be obtained either by listening to the webinar recordings at the SIMULIA 3DExperience user community portal, leaving a comment on this post, or contacting us.

SPH boundary conditions improvements

In Abaqus 2017, SPH particles located on opposite sides of a surface do not interact with each other by default, even in the absence of an explicit boundary condition; in previous releases they could. There are further improvements in tensile instability control to prevent instability among particles subjected to local tensile stresses. Below is an example with SPH particles in two chambers; the lower-chamber particles are subjected to a displacement boundary condition while the upper-chamber particles are not.

DEM improvements

  1. The serial and parallel contact search algorithms have been unified to improve DEM performance. The search cells are now created only once.
  2. It is now possible to run DEM jobs with particle generators in parallel mode. This means more than one particle generator can be active while a DEM job is running.
  3. In previous releases, only fixed time increment scheme was available and it was difficult for the user to predict the appropriate time increment. In the 2017 release, an automatic time increment scheme has been introduced.
  4. Adhesive particle mixing is now supported, using the JKR (Johnson-Kendall-Roberts) adhesive inter-particle contact model. Both Hertz contact and friction are also supported.
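As a point of reference for the contact models named above, the classical Hertzian normal force between elastic particles grows with the overlap δ as F = (4/3)·E*·√R*·δ^(3/2), where E* and R* are the effective modulus and radius. A quick numerical check of that scaling (the E* and R* values are illustrative and not tied to any Abaqus defaults):

```python
import math

def hertz_normal_force(delta, e_eff, r_eff):
    """Hertzian normal contact force for a particle overlap delta >= 0."""
    return (4.0 / 3.0) * e_eff * math.sqrt(r_eff) * delta ** 1.5

# Doubling the overlap multiplies the force by 2**1.5 (about 2.83).
f1 = hertz_normal_force(1e-4, e_eff=1e9, r_eff=1e-3)
f2 = hertz_normal_force(2e-4, e_eff=1e9, r_eff=1e-3)
print(f2 / f1)
```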

Material Enhancements

  1. There is good news for users in the health care industry who design and manufacture cardiovascular stents: superelasticity, which previously required a user subroutine, is now available in the Abaqus 2017 material library. The motivation is Nitinol, a nickel-titanium alloy used in cardiovascular stents because of its superelasticity, shape memory effect, biocompatibility, and fatigue resistance. The Nitinol model exhibits linear elastic austenite behavior at lower stresses. On further loading, transformation from austenite to martensite occurs, but the behavior is still linear elastic. Beyond full transformation, martensite exhibits elastic-plastic behavior. A similar phenomenon is observed in compression. It is supported in Abaqus/CAE.

2. A multilinear kinematic hardening model is now available in Abaqus 2017. In previous releases, this model was available as a user subroutine material called ABQ_MULTILIN_KINHARD. Plasticity follows an array of perfectly plastic subvolumes obeying the von Mises criterion, each with a unique yield strength. This model offers more flexibility than the linear kinematic hardening model. It is available only in Abaqus/Standard and is intended for thermo-mechanical fatigue of metals. It is supported in Abaqus/CAE.

3. The definition of damage initiation and damage evolution for cohesive elements with traction-separation response has been enhanced to include rate-dependent cohesive behavior. It is available only in Abaqus/Explicit.


4. Non-linear damage initiation for ductile metals is now supported in Abaqus 2017. This model provides more flexibility to predict damage under arbitrary loading paths. It is available in both Abaqus/Standard and Abaqus/Explicit for the ductile, shear, and Johnson-Cook material models.

5. The parallel rheological framework model now supports plane stress elements as well, in both Abaqus/Standard and Abaqus/Explicit.

6. A new user subroutine for user-defined thermal expansion, VUEXPAN, has been introduced. It can be used in Abaqus/Explicit to define thermal strain increments as a function of temperature, time, element number, state, or field variables. It is available only with Mises plasticity, Hill plasticity, and the Johnson-Cook model.
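Conceptually, such a routine returns the thermal strain increment obtained by integrating a temperature-dependent expansion coefficient over the temperature change in the increment. A plain-Python sketch of that calculation (the linear α(T) fit is invented for illustration; this is not the VUEXPAN interface itself):

```python
def alpha(temperature):
    """Illustrative temperature-dependent expansion coefficient."""
    return 1.0e-5 + 2.0e-9 * temperature

def thermal_strain_increment(t_old, t_new, n=100):
    """Integrate alpha(T) dT over the increment with the midpoint rule."""
    dt = (t_new - t_old) / n
    return sum(alpha(t_old + (i + 0.5) * dt) * dt for i in range(n))

# Heating from 0 to 100 degrees: expect about 1.01e-3 strain for this alpha.
print(thermal_strain_increment(0.0, 100.0))
```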

Usability Enhancements

1. Enhancements in distortion control: in Abaqus/Explicit, it is possible to convert highly compressed solid elements to a linear kinematic formulation. Once that happens, the analysis does not stop even if the elements become inverted. This is activated by default when solid elements are used with a crushable foam material.

2. Larger stable time increments in Abaqus/Explicit: Abaqus 2017 includes an improved method of estimating the element characteristic length to obtain larger stable time increments. It is defined in the explicit step as follows:

*Dynamic, Explicit, improved DT method=YES (by default) or NO

It is further possible to invoke this method selectively for individual element sets instead of the global model, as follows:

*Section Controls, improved DT method = YES or NO

 

The Dassault Systèmes SIMULIA portfolio releases new versions of its software products every year, and this year is no different. The first release of Abaqus 2017 is now available for download at the media download portal. SIMULIA has developed and broadcast 2017 release webinars to make users aware of new features available in the 2017 release, but those webinars are long recordings ranging from one to two hours each, which can be daunting. This blog post will provide a brief highlight of standard and explicit updates in the Abaqus 2017 Solver. A more detailed explanation of any mentioned update, or answers to further questions, can be obtained either by listening to the webinar recordings at the SIMULIA 3DExperience user community portal, leaving a comment on this post, or contacting us.

Updates in Abaqus Standard

Abaqus/Standard 2017 has been substantially improved with respect to contact formulations. The key highlights of the contact functionality improvements are mentioned below.

  • Edge-to-surface contact has been enhanced with beams as the master definition. This new approach captures the twisting of beams during frictional contact.
  • Cohesive behavior in general contact.

General contact has always been useful in situations where it is cumbersome to visualize and define a large number of contact pairs, even using the contact wizard, or where it is not possible to predict contact interactions from the initial configuration. General contact now supports cohesive behavior, making it possible to define contact in situations such as the one shown in the figure below.

 

Cohesive contact does not constrain rotational degrees of freedom. These DOFs should be constrained separately to avoid pivot ratio errors.

There have been a few other changes in cohesive contact interactions. In the 2016 release, only first-time cohesive contact was allowed by default: either a cohesive behavior closed at initial contact, or an open initial contact that could convert to closed cohesive contact once. In the 2017 release, by default only a contact that is closed initially maintains cohesive behavior; an open contact can no longer convert to cohesive contact later. However, it is possible to change the default settings.


 

  • Linear complementarity problem

A new step feature has been introduced to remove some limitations of the perturbation step. In earlier releases, it was not possible for contact defined in a perturbation step to change its status from open to closed or vice versa. In the 2017 release, a linear complementarity problem (LCP) technique has been introduced in the perturbation step to define frictionless, small-sliding contact that can change its contact status. No other forms of non-linearity are supported in perturbation steps. LCP is available only for static problems; dynamic steps are not supported.

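The complementarity structure behind such a contact formulation is small enough to show in full for a single point: the gap w and contact force z satisfy w = M·z + q with w ≥ 0, z ≥ 0, and w·z = 0, so exactly one of "gap open, zero force" or "gap closed, positive force" holds. A one-dimensional sketch (the M and q values are illustrative, and this is not the Abaqus implementation):

```python
def solve_lcp_1d(m, q):
    """Solve w = m*z + q with w >= 0, z >= 0, w*z = 0 for one contact.
    q is the initial gap (negative means penetration), m the contact
    stiffness; returns (z, w) = (contact force, final gap)."""
    if q >= 0.0:
        return 0.0, q      # contact stays open: no force needed
    return -q / m, 0.0     # contact closes: force removes the penetration

print(solve_lcp_1d(2.0, 0.5))   # open contact
print(solve_lcp_1d(2.0, -1.0))  # closed contact
```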

Updates in Abaqus XFEM (crack modeling) […]

The Dassault Systèmes SIMULIA portfolio releases new versions of its software products every year, and this year is no different. The first release of Abaqus 2017 is now available for download at the media download portal. In this blog post, I provide a brief highlight of updates in Abaqus CAE 2017. A more detailed explanation of any mentioned update, or answers to further questions, can be obtained either by listening to the webinar recordings at SIMULIA 3D Experience user community portal, leaving a comment on this post, or contacting us.

  • Consistency check for missing sections

Abaqus/CAE users would probably agree that forgetting to assign a section happens quite often, even though parts with defined section assignments are displayed in a separate color. In previous releases, this check was not included in data check runs, so the error could not be detected unless a full run was executed. In the 2017 release, regions with missing sections are identified in a data check run, saving time by eliminating runs that end in fatal errors.


 

  • New set and surface queries in query toolset

Sets and surfaces can be created at the part level as well as the assembly level. In earlier releases, it was not possible to see the content of a set or surface as text, though it could be visualized in the viewport. In the 2017 release, the query toolbox includes set and surface definition options. For sets, information about geometry, nodes, and elements can be obtained with respect to label, type, connectivity, and association with a part or instance, whichever is applicable. For surfaces, the name, type, and association with instances, constraints, or interactions can be obtained.


 

  • Geometry face normal query

In the 2017 release, it is possible to visualize the normal of a face or surface by picking it in the viewport. For planar faces, the normal is displayed instantly. For curved faces, CAE prompts the user to pick a point location on the face using various options.

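For a planar face, the reported normal is simply the normalized cross product of two edge vectors in the face. A quick sketch of that calculation (the points are illustrative; these are not CAE API calls):

```python
def face_normal(p0, p1, p2):
    """Unit normal of the plane through three non-collinear points."""
    u = [p1[i] - p0[i] for i in range(3)]
    v = [p2[i] - p0[i] for i in range(3)]
    n = [u[1] * v[2] - u[2] * v[1],   # cross product u x v
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0]]
    length = sum(c * c for c in n) ** 0.5
    return [c / length for c in n]

print(face_normal((0, 0, 0), (1, 0, 0), (0, 1, 0)))  # [0.0, 0.0, 1.0]
```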

[…]

In a race, the quickest runner can never overtake the slowest, since the pursuer must first reach the point whence the pursued started, so that the slower must always hold a lead.

— Aristotle, Physics VI:9, 239b15

This paradox, first developed by Zeno and later retold by Aristotle, shows us that a mathematical argument can be disproved by taking its hypothesis to an absurd conclusion. To look at it another way, consider this joke:

A mathematician and an engineer are trapped in a burning room.

The mathematician says “We’re doomed! First we have to cover half the distance between where we are and the door, then half the distance that remains, then half of that distance, and so on. The series is infinite.  There’ll always be some finite distance between us and the door.”

The engineer starts to run and says “Well, I figure I can get close enough for all practical purposes.”
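The engineer's "close enough" can be made precise: the remaining distance halves each step, so it drops below any practical tolerance after a few dozen steps. A quick check:

```python
# Zeno's halving: how many steps until we are within a hair of the door?
door, position, steps = 1.0, 0.0, 0
while door - position > 1e-9:  # "close enough for all practical purposes"
    position += (door - position) / 2.0
    steps += 1
print(steps)  # 30 halvings get within a billionth of a 1-unit distance
```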

The principle here, as it relates to simulation such as FEA, is that every incremental step taken in the simulation process gets us closer to our ultimate goal of understanding the exact behavior of the model under a given set of circumstances. However, there is a point of diminishing returns at which a physical prototype must be built. This evolution of simulating our designs has saved a lot of money for manufacturers who, in the past, would have had to build numerous iterative physical prototypes. This evolution of FEA reminds me of…

The uncanny valley is the idea that as a human representation (robot, wax figure, animation, 3D model, etc.) increases in human likeness, people's affinity for the representation also increases, but only up to a certain point. Once this threshold is crossed, our affinity drops off to the point of revulsion, as in the case of zombies or "intermediate human-likeness" prosthetic hands. However, as the realism continues to increase, the affinity in turn starts to rise again.

Personally, I find this fascinating: a trend moving through time can abruptly change direction, and then, for some strange reason, revert to its original direction. Why does this happen? There are myriad speculations as to why on the Wikipedia page, which I encourage the reader to peruse at leisure.

But to tie this back to FEA, think of the beginning of the uncanny valley curve as the start of computer-aided design simulation. The horizontal axis is time; the vertical axis is accuracy. I posit that over time, as simulation software has improved, the accuracy of our simulations has also increased. Ease of use has improved as well, allowing non-doctorate holders to utilize simulation as part of their design process.

And this is where we see the uncanny valley: as good as the software is, there comes a point, if you use specialized, intricate, or non-standard analysis, where the accuracy of the software falters. This tells us that there will still be a need for those PhDs, and once they join the design effort and start using the software, we see the accuracy go up exponentially.

If you need help getting to the door, or navigating the valley, talk to us about our Simulation benchmark process. Leave a comment or click to contact us.

 

In the years to come, fuel efficiency and reduced emissions will be key factors in determining success within the transportation & mobility industry. Fuel economy is often directly associated with the overall weight of the vehicle. Composite materials have been widely used in the aerospace industry for many years to achieve the objectives of light weight and better performance at the same time.

The transportation & mobility industry has been following the same trends, and it is not uncommon to see composites applied in this industry sector nowadays; however, unlike in aerospace, wide replacement of metals with composites is not feasible in the automotive industry. Hence, apart from material replacement, other novel methods of designing and manufacturing lightweight structures without compromising performance will find greater utilization in this segment. In this blog post, I will discuss the application of Tosca, a finite-element-based optimization technology.

The lightweight design optimization using virtual product development approach is a two-step process: concept design followed by improved design.

Design concept: Product development costs are mainly determined in the early concept phase. The automatic generation of optimized design proposals reduces the number of product development cycles and the number of physical prototypes; quality is increased and development costs are significantly reduced. All you need is the definition of the maximum allowed design space; Tosca helps you find the lightest design that fits it and satisfies all system requirements. The technology associated with the concept design phase is called topology optimization; it considers all design variables and functional constraints in the optimization cycle while pursuing the minimum-weight objective function. The technique is iterative and usually converges to an optimal design.

HOW IT WORKS

The user starts with an initial design by defining the design space, design responses, and objective function. The design space is the region from which material removal is allowed in incremental steps, and the objective function is often the overall weight of the component to be minimized. With each incremental removal of material, the performance of the component changes; hence each Tosca increment is followed by a finite element analysis to check the current performance against the target performance. If the target performance criterion is satisfied, the updated design increment is accepted and Tosca proceeds to the next increment. This process of incremental material removal continues until the objective function is satisfied or no further design improvement is feasible. The image below depicts the complete CAD-to-CAD process flow in Tosca. The intermediate steps include Tosca pre-processing, co-simulation between Tosca and a finite element code, and Tosca post-processing.

Tosca workflow
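The loop just described can be caricatured in a few lines: repeatedly remove the least-contributing material and keep the change only while a performance check still passes. The "analysis" below is a stand-in number, not a finite element solve, and all values are invented for illustration:

```python
def topology_optimize(elements, target_performance, removal_fraction=0.1):
    """elements: id -> (stand-in) stiffness contribution.
    Remove the lowest contributors each increment; accept the increment
    only while the performance check still meets the target."""
    design = dict(elements)
    while True:
        n_remove = max(1, int(len(design) * removal_fraction))
        worst = sorted(design, key=design.get)[:n_remove]
        trial = {e: k for e, k in design.items() if e not in worst}
        if sum(trial.values()) < target_performance:  # "FEA" check fails
            return design  # no further improvement feasible
        design = trial

elements = {i: 1.0 + 0.1 * i for i in range(20)}
final = topology_optimize(elements, target_performance=25.0)
print(len(final), round(sum(final.values()), 1))
```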

During the material removal process, Tosca may be asked to produce an optimization result that is feasible not only from a design perspective but from a manufacturing perspective as well. For example, Tosca may be asked to recommend only those design variations that can be manufactured by casting or stamping. This is done by defining one or more of the manufacturing constraints available in the Tosca constraints library.

manufacturing constraints

While topology optimization is applicable only to solid structures, that does not mean Tosca cannot optimize sheet metal parts. The sizing optimization module of Tosca allows users to define the thicknesses of sheet metal parts as design variables with lower and upper bounds. […]
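The sizing idea can be sketched the same way: thicknesses are the design variables, kept within their bounds while a stand-in stiffness response is held above a target. All numbers here are illustrative, and the "stiffness" function replaces what would be a real FE solve:

```python
def stiffness(panels):
    """Stand-in structural response: grows with thickness cubed."""
    return sum(p["t"] ** 3 * p["area"] for p in panels)

def mass(panels):
    return sum(p["t"] * p["area"] for p in panels)

def size_panels(panels, min_stiffness, step=0.05):
    """Greedy sizing: shave thickness from any panel whose lower bound
    and the stiffness target both allow it, until nothing can change."""
    changed = True
    while changed:
        changed = False
        for p in panels:
            if p["t"] - step >= p["lo"]:
                p["t"] -= step
                if stiffness(panels) >= min_stiffness:
                    changed = True
                else:
                    p["t"] += step  # undo: target would be violated
    return panels

panels = [{"t": 1.0, "lo": 0.5, "hi": 1.0, "area": 1.0} for _ in range(3)]
before = mass(panels)
size_panels(panels, min_stiffness=2.0)
print(mass(panels) < before, stiffness(panels) >= 2.0)
```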

Predictive Engineering Analytics is a must in current product design and is required to integrate all the multidisciplinary inputs present in today's complicated products. In the words of Siemens:

“Predictive Engineering Analytics (PEA) is the application of multi-discipline simulation and test, combined with intelligent reporting and data analytics, to develop digital twins that can predict the behavior of products across all performance attributes, throughout the product lifecycle.”

In the above quote, the concept of a digital twin is important; this is the goal of having a complete digital representation of a physical product throughout its design, manufacturing and in-service lifecycle. Such a digital twin can accurately predict all behaviors of the physical product.

There are five key ways that Simcenter(TM) helps achieve a digital twin.

  1. Simcenter(TM) has an integrated Engineering Desktop environment, which allows all pre- and post-processing operations to be carried out. This environment has best-in-class geometry editing tools, comprehensive meshing, and an ability to associate the analysis model to design data.
  2. The environment is completely extendable and can be scaled from simple to complex problems. The benefits include a common user interface and the capability to create automated routines for common simulation problems.
  3. Simcenter(TM) can be linked into other integrated products and engineering systems. This enables simulation data management and allows links to be established to test data, 1D simulations, and 3D CAD. Engineers now have confidence that the behaviors predicted by the digital twin correlate with real life.
  4. The solution produces a common environment across all engineering departments. This allows knowledge to be captured, automated, and then used by a broader team. Specific solutions and best practices become entrenched in the organization, allowing for more consistent design processes. Training requirements are also reduced.
  5. Simcenter(TM) leverages the extensive industry knowledge and capabilities of the Siemens PLM broader portfolio.

If we look at the specific functions that Simcenter(TM) can cover, here is a quote and graphic from a Siemens presentation:


Another unique feature of the Simcenter(TM) solution is its open platform capability. The solution can be used as the primary pre- and postprocessor for Siemens PLM Solvers, NX Nastran and LMS Samcef, or for popular third party solvers, such as Abaqus, ANSYS, LS-DYNA, and MSC Nastran. This is illustrated in another graphic from a Siemens presentation: […]

How many times has the first design iteration submitted to FEA modeling passed the design criteria?

The answer is close to zero, but even if it does happen by a stroke of fortune, the design is not optimal; although the design requirements are met and validated by FEA, there is always scope for improvement, either in cost or in performance. In general, it is not unusual to take 15 to 20 iterations to reach the optimal design.

An analyst knows the pain of creating a detailed finite element simulation model. Most of the steps involved, such as geometry cleanup and meshing, are very time-consuming, and they are primarily driven by geometry. Let's look at the workflow in more detail:

An analyst in the automotive industry often performs finite element modeling in HyperMesh, stress analysis in Abaqus, optimization in OptiStruct, and durability in fe-safe or nCode. An analyst in the aerospace industry often performs CAD composites work in CATIA, finite element modeling in Abaqus/CAE, stress analysis in Abaqus or Nastran, and durability in fe-safe. Analysts in other industries have their own suites of FEA tools. The entire process requires data to flow from one simulation code to another: the output of one code serves as input to the next. Quite often this transfer is done manually by the analyst.

This means that when the optimal design takes 20 iterations, as mentioned above, an analyst has to clean geometry 20 times, create FE meshes manually 20 times, and transfer the simulation data from one piece of code to another 20 times. By the time these design iterations are over, the analyst's face and computer look somewhat like this:

Let analysts remain analysts, and let a simulation robot do the rest!

The traditional job of a finite element analyst is to build robust, high-fidelity simulation models that give correct results under real-life loads. The analyst is not an FE robot who can perform repetitive tasks with ease. In situations like the one mentioned above, it makes perfect sense to have the FE analyst create a robust FE model only once per FE code involved, and then introduce a simulation robot that captures the steps and workflow, creates a script, and executes that script as many times as needed. This simulation robot is called Isight. […]
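The capture-and-replay idea can be pictured as a chained pipeline in which the output of each tool feeds the next. The functions below are trivial stand-ins for the real codes, with an invented thickness-to-stress response; nothing here is the Isight API:

```python
# Stand-ins for the chained tools in an automated design loop.
def clean_geometry(design):
    return {"geometry": dict(design)}

def mesh(model):
    return {**model, "mesh": "generated"}

def solve(model):
    # toy response: stress falls as thickness grows (illustrative only)
    return {**model, "stress": 100.0 / model["geometry"]["thickness"]}

def run_iterations(designs, allowable_stress):
    """Replay the captured workflow once per design iteration."""
    report = []
    for design in designs:
        result = solve(mesh(clean_geometry(design)))
        report.append((design["thickness"], result["stress"],
                       result["stress"] <= allowable_stress))
    return report

designs = [{"thickness": t} for t in (1.0, 1.5, 2.0)]
for thickness, stress, ok in run_iterations(designs, allowable_stress=60.0):
    print(thickness, round(stress, 1), ok)
```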

I often hear customers designing mechanical components say something like, "I have assembly design constraints and don't think I need kinematics." The truth is, you may not need it, if you design items that do not move. Kinematics is the study of motion. Even with the standard CATIA V5 Assembly Design constraints, you are limited to a single movement based on a given set of constraints: either hold down the right mouse button while using the compass to move an item (which respects the constraints already applied), or check the "With respect to constraints" option in the Manipulate dialog.

Below is an example of what can be done with simple assembly constraints and the manipulator, and where its limitations are.

 

What if you needed more than one movement to happen at the same time? That is where kinematics will help. With CATIA V5 Kinematics, you have many, many options for setting up motion. Each grouping of given movements is called a mechanism, and within the mechanism you have joints. CATIA V5 offers every kind of joint I can think of, and I have yet to run across anything else I would need.

joints

The joints are groupings of your constraints that can then be controlled by commands; they are very simple to set up. The freedom to have anything move at any given time! In fact, if you already have constraints defined in your assembly, there is a slick converter option to re-use the work you have already done and turn your constraints into joints! Below is a simple mechanism with multiple joints defined, played back to show how the toy excavator product works.

 

Although this is a simple mechanism, the Kinematics package has the ability to do so much more… like analyze the travel of a particular joint and check whether its limits have been reached. If you combine the Kinematics package with the CATIA V5 Space Analysis license, you will have the ability to check for clash and clearance between moving parts, which is exactly what most customers need to do! Add in a CATIA V5 DMU Navigator license and you can animate your sections: how cool is that?!

Bottom line: if you need motion and need to know how your motion affects other parts in your assembly, contact us and we will get you moving in the right direction!

 

 

There is a phrase in the finite element analyst community. Those who have been in the industry for a while must have heard it at some point in their career:

     GARBAGE IN….GARBAGE OUT

It means that if the data fed into the input deck is not correct or appropriate, the solver is very likely to give incorrect results, and that is if it does not fail with errors. Many of us believe that getting some sort of result is better than getting fatal errors, which is not correct. Fatal errors give the user clear diagnostic messages that allow the input deck to be corrected. Getting erroneous results, however, can make a user feel that the simulation has been successful even though the results may be far from reality. Such situations are hard to detect and correct, as the underlying cause is not clearly visible.

One such situation arises when the user inadvertently chooses an element type that is not capable of capturing the actual physical behavior of the part or assembly with which the element is associated. The incompatibility may lie with respect to element material, element topology, element dimension, or the type of output associated with the element. The objective of this post is to highlight the capabilities and limitations of some lesser known element types available in the Abaqus element library to promote their proper usage.

Planar elements

These elements are further classified as plane stress (CPS) or plane strain (CPE) elements. Plane stress elements are used to model thin structures, such as a composite plate. These elements must be defined in, and can deform only in, the X-Y plane. For these elements:

σzz = τxz = τyz = 0


Plane strain elements are used to model thick structures, such as rubber gaskets. These elements must also be defined in, and can deform only in, the X-Y plane. For these elements:

εzz = γxz = γyz = 0

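For an isotropic linear-elastic material, the two assumptions lead to different constitutive matrices relating (σxx, σyy, τxy) to (εxx, εyy, γxy). The sketch below contrasts them; the E and ν values are illustrative, and the formulas are the standard textbook ones rather than anything Abaqus-specific:

```python
E, nu = 210e3, 0.3  # illustrative Young's modulus (MPa) and Poisson ratio

def plane_stress_d(E, nu):
    c = E / (1.0 - nu ** 2)
    return [[c, c * nu, 0.0],
            [c * nu, c, 0.0],
            [0.0, 0.0, c * (1.0 - nu) / 2.0]]

def plane_strain_d(E, nu):
    c = E / ((1.0 + nu) * (1.0 - 2.0 * nu))
    return [[c * (1.0 - nu), c * nu, 0.0],
            [c * nu, c * (1.0 - nu), 0.0],
            [0.0, 0.0, c * (1.0 - 2.0 * nu) / 2.0]]

# The plane strain assumption yields a stiffer in-plane normal response,
# while the shear term reduces to the shear modulus G in both cases.
print(plane_strain_d(E, nu)[0][0] > plane_stress_d(E, nu)[0][0])  # True
```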

Generalized plane strain elements

[…]

© Tata Technologies 2009-2015. All rights reserved.