In previous blog articles on 3D Experience simulation roles, we primarily discussed platform configurations, the concept of personas and roles, and the simulation capacity of the platform. This blog article contains detailed information about three primary structural simulation roles: MDS, DRD and SMU.

To begin with, let's recapitulate that simulation roles are categorized in groups based on the personas of the users working with those roles. In terms of complexity and functionality, offerings range from basic to intermediate to advanced.

 

Engineer profile: This is the simplest and easiest-to-use simulation offering, primarily meant for designers with low to intermediate simulation knowledge. Their primary job is product design and they perform simulations only occasionally. Roles for this profile are CAD-centric and are associated with a guided workflow. Simulation tokens are embedded in the role.

Analysis engineer profile: This profile is one level above the engineer profile and is suitable for structural analysis engineers associated with product engineering. Their simulation knowledge is of intermediate level, which means they understand the simulation process in terms of meshing, boundary conditions, loads, result visualization, etc., but don't have any hands-on experience with advanced simulation tools. Usually there is no guided workflow. Simulation tokens are embedded in the role.

Analyst profile: This role is for full-time analysts who primarily perform intermediate to advanced simulations. They have in-depth expertise in at least one simulation domain and often hold Master's or Doctorate-level credentials. This role requires extensive knowledge of pre-processing, solver terminology such as statics, dynamics, non-linearity and convergence schemes, as well as post-processing. There is no guided workflow. Simulation tokens are procured separately.

Research Specialist profile: This is a complex simulation offering primarily for experts who develop novel simulation workflows and processes. The simulation requirements often span multiple physical domains and involve advanced physics such as vibrations and noise. The pre-processing aspect may include complex meshing of assemblies and assemblies of meshes.

Let's look at one role from each of the first three profiles:

Stress Engineer role (MDS)

 

It's a role from the engineer profile and has a guided workflow. The snapshot shows the apps available in the MDS role. It performs routine strength and deflection calculations under static loading conditions. It can also compute product fatigue life for very simple loads. Associativity with CATIA and SOLIDWORKS is well maintained. Local solver execution on up to 4 cores is included.

Structural Analysis Engineer role (DRD)

 

It's a role from the analysis engineer profile and has no guided workflow. It is used to assess the structural integrity of products subjected to a wide range of loading conditions. The snapshot above shows the apps available in this role. It works on the Model-Scenario-Results (MSR) concept found in advanced simulation tools. Many advanced settings are exposed to the user. This role can perform multi-step simulations. Local job execution on up to 8 cores is available.

Mechanical Analyst role (SMU)

 

It's a role from the analyst profile and does not include a guided workflow. The snapshot above shows the apps available in this role. It uses advanced finite element techniques to simulate and validate complex engineering problems. It offers multiple advanced meshing techniques such as octree, surface, sweep and RBM. Both single-step and multi-step scenarios are included. Supported analysis steps include static perturbation, non-linear static, frequency, buckling, implicit dynamics, explicit dynamics, steady-state heat transfer and transient heat transfer. Most non-linear materials and complex engineering connections are included.

While we discussed one prominent role from each profile, the south quadrant of the 3D Experience platform offers numerous simulation roles. To know more, please contact us.

Embracing a true PLM platform and solution is not an easy endeavor for many companies, even when they recognize the potential value and ROI offered by a rightly architected PLM solution. Success in any enterprise software implementation like PLM requires careful planning, dedicated resources, the right technical expertise, executive sponsorship, and a receptive culture, among other things. When done the right way, the results of such efforts are transformational, producing significant business benefits that can be measured and validated.

One of the biggest challenges to adopting PLM is organizational change management, given the breadth and scale of a true PLM solution. Many companies approach it in phases, and rightly so; but the key is how the phases are architected, tracked and measured. PLM involves managing and linking data, processes and people together as the product goes through its lifecycle from inception to design to manufacturing to support and eventually end of life. The first step of this is often managing data; specifically, engineering CAD data. Most solutions start with a way to vault the CAD data along with some basic part numbering schemes and revision rules. Sometimes engineering documents are also vaulted along with the CAD data.

Yes, data vaulted in a central repository brings a lot of benefits, like elimination of duplicates, basic check-in/check-out and access controls, and added search capabilities, as opposed to data scattered across multiple locations. But the measured value of this alone may not substantiate the heavy PLM IT investment companies need to make for a truly scalable PLM platform. Sometimes there is an expectation misalignment between the full PLM value and just the data vaulting value. This at times sends companies into a long lull of “PLM assessment” after data vaulting. Sometimes cultural resistance or organizational change overturns any momentum. Maybe a technical glitch or integration shortfall previously overlooked becomes a deal breaker. An over-scoped and under-supported initiative can also run out of money or time.

Companies make a considerable IT investment in the PLM platform upfront so that they have a scalable solution for all phases, not just CAD vaulting. Most of the time they can add more capabilities and processes to the PLM platform without additional IT investments. So it's very important to get past the initial data vaulting phase and move on to the next phases to maximize the utilization of existing IT investments. Now the question is: where do we go after CAD vaulting? This is where upfront PLM roadmap definition is so important in terms of how the phases are architected, tracked and measured. For companies that have successfully completed data vaulting but do not yet have a formal PLM roadmap defined, some of the next focus areas to consider are engineering process management, BOM management, change management, requirements management, and project and program management, in no specific order.

One thing common between the SIMULIA roles of the 3DExperience platform and the standalone Abaqus products is that both require an Abaqus solver to perform computations. This means that both solutions require Abaqus tokens to complete or speed up the computation part of the simulation. For the standalone Abaqus product, the calculation is straightforward: Abaqus requires a minimum of five tokens to execute a single-core non-linear job. Large models require more cores to solve in reasonable time, and more cores require more tokens as follows:
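The core-to-token relationship can be sketched in Python. The formula below, tokens = ⌊5 × N^0.422⌋, is the commonly cited Abaqus token approximation and reproduces the minimum of five tokens for a single-core job; treat it as an illustration and confirm exact values against your license terms.

```python
import math

def abaqus_tokens(cores: int) -> int:
    """Approximate Abaqus token consumption for an N-core job.

    Based on the commonly cited formula floor(5 * N**0.422);
    actual entitlement should be confirmed with the license terms.
    """
    return math.floor(5 * cores ** 0.422)

# Tokens for typical core counts: doubling cores does not double tokens.
for n in (1, 2, 4, 8, 16):
    print(f"{n:>2} cores -> {abaqus_tokens(n)} tokens")
```

Note the sub-linear scaling: an 8-core job needs 12 tokens rather than 40.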

The computation capacity of the 3D Experience platform, however, cannot be defined by a single equation. Unlike the Abaqus solver, which is available as an integrated all-in-one license for all types of simulations (standard, CFD, explicit, etc.), 3D Experience offerings come in the form of roles. Each role is a sellable license that includes either some or all Abaqus solver capabilities. Offerings are made more flexible still by on-premise vs. on-cloud options. Let's have a look at the solver offerings in different configurations and roles.

Engineer role vs. Analyst role

While most design engineer roles have embedded Abaqus tokens, most analyst roles have no compute capacity at all. The number of tokens embedded in a designer role depends on the level of simulation complexity the role can accommodate. For example:

Stress Engineer role has 8 embedded tokens to accommodate jobs of up to 4 cores

Structural analysis engineer role has 12 embedded tokens to accommodate jobs of up to 8 cores

It is possible to submit jobs on more cores than the embedded solver permits, but in that situation external tokens need to be utilized and the embedded tokens contribute nothing at all.
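That embedded-versus-external decision can be sketched as below. The role entitlements mirror the two examples above; `tokens_needed` uses the same commonly cited Abaqus formula, and the function names and the all-or-nothing external fallback are illustrative assumptions, not a licensing API.

```python
import math

# Embedded entitlements as described in the text (illustrative).
ROLES = {
    "MDS": {"embedded_tokens": 8, "core_limit": 4},   # Stress Engineer
    "DRD": {"embedded_tokens": 12, "core_limit": 8},  # Structural Analysis Engineer
}

def tokens_needed(cores: int) -> int:
    # Commonly cited Abaqus token formula: floor(5 * N**0.422).
    return math.floor(5 * cores ** 0.422)

def token_source(role: str, cores: int) -> tuple:
    """Return which pool a job draws on and how many external tokens it needs.

    Within the role's core limit, the embedded tokens cover the job;
    beyond it, the entire job draws on external tokens and the
    embedded tokens contribute nothing.
    """
    if cores <= ROLES[role]["core_limit"]:
        return ("embedded", 0)
    return ("external", tokens_needed(cores))

print(token_source("MDS", 4))   # within the 4-core embedded limit
print(token_source("MDS", 8))   # exceeds it: external tokens for the whole job
```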

Tokens vs. Credits

In the case of analyst roles such as stress analyst, fluid mechanics analyst, etc., the role itself has no compute capacity; this must be procured separately, either in the form of tokens or credits. Tokens are a renewable form of compute capacity, which means they can be used over and over. The 3D Experience platform uses tokens in very much the same fashion as standalone Abaqus: token consumption with respect to the number of cores is identical on both. Credits, on the contrary, are a non-renewable form of compute power. Just like talk time on a phone, credits can be consumed only once.

Why credits at all?

In general, credits are an expensive proposition for the customer, but there are exceptions. Credits are utilized to meet unexpected and rare increases in peak usage. This is somewhat more common in engineering consulting firms, which can face high demand for simulation capacity due to an influx of many short-duration simulation projects at any time. To meet this sudden spike in demand, a one-time credit bundle makes more sense than an increase in perpetual tokens. Once the peak demand is over and the credits are consumed, simulation capacity returns to normal levels.

On premise vs. on cloud

Design engineer as well as analyst roles are available in both on-premise and on-cloud formats. There are three ways of utilizing cloud resources: store the models on the cloud, store the results on the cloud, and solve on the cloud. The first two offerings require only cloud storage and are available at no additional charge with a cloud-based license. The third offering, however, requires cloud compute resources, which consume compute credits in addition to the cloud-based license.

Need to know more about SIMULIA 3D Experience platform compute capacity? Please approach us and we are ready to help.

 

Does your organization struggle to produce CAD and digital definitions of products? Is CAD development a bottleneck in your process? If the answer is yes, you could benefit from a Digital Engineering Benchmark.

The Digital Engineering Benchmark assessment captures the opinions of senior and knowledgeable personnel in your organization on the current state and future Digital Engineering requirements of your business. In addition, a priority for improvement and an assessment of current effectiveness are recorded. It centers on 17 key Digital Engineering “Pillars” ranging from 3D CAD Standards through to CAD Extensions. The pillars are listed below:

  1. 3D CAD Standards
  2. Drawing Standards
  3. CAD Templates
  4. 3D Standard Features
  5. Standard Parts Library
  6. Materials Library
  7. Automated Drawing Generation
  8. 3D Master
  9. Automated Designs
  10. Automation Scripts
  11. Digital Mockup
  12. Spatial Analysis
  13. Special CAD Extensions
  14. Design for Manufacturing
  15. CAD Checking Tools
  16. Intellectual Property Protection
  17. Publications

After the 17 pillars have been covered, senior and knowledgeable personnel are also invited to “spend” an assumed benefit in value areas within your business. The areas identified are improving time to market, increasing the portfolio of the company and improving product quality.

Finally, the tool produces a comprehensive report showing the customer’s current state of maturity and a benchmark comparison with the industry.

Participants have found this process to be very useful as it allows them to prioritize their initiatives, gives a high-level view of their roadmap to success and provides them with industry benchmark information.

Organizations invest huge sums of money in simulation software to avoid expensive and disruptive physical testing processes. But how long does it really take to make this transformation happen? One thing is sure: it does not happen in a day. The flow chart below explains the reason pictorially. The last two blocks, “compare and improve model” and “compare and improve theory”, make this transformation a longer process than expected.

 

Let's explore the reasons behind it. Comparison is needed to make sure that simulation results mimic the physical testing results before the latter can be discarded, partially or fully. The difference in results can be due to three main factors: lack of user competency, limitations of the software used, and lack of sufficient input data.

Lack of user competency: FEA analysts are not born in a day. The subject is complex to learn, and so is the software associated with it. The ramp-up time really depends on the analyst's background along with the complexity of the problem being simulated. Organizations usually make a choice between hiring expert (and expensive) analysts who can deliver results right away, or developing analysts of their own through classroom and hands-on training. The first option saves time, while the second saves money. CAE software companies are also making headlines these days by introducing CAD-embedded simulation tools that require only nominal user competency. Nevertheless, competency builds up over time.

Limitation of software used: The initial investment in the simulation domain is usually small. This means one of two things: either the number of users is small or the software functionality is limited. With time, the complexity of problems goes up but the software remains the same. A common example I have seen is a customer starting with a simple linear simulation workbench in CATIA and, over a period of time, trying to simulate finite-sliding contact problems with frictional interfaces in the same workbench. Users don't realize that their problem complexity has exceeded the software's capacity and that it's time to upgrade. It is always recommended that analysts get in touch with their software vendor whenever they anticipate a need for increased simulation capacity or functionality. A certified simulation software vendor is a trusted advisor who can really help.

Lack of sufficient input data: “Garbage in, garbage out” is a very common phrase in the simulation world. However, at times it is very difficult to get the right input for the software in use. The complexity of input data can arise either from complex material behavior or from complex loading conditions. Examples of complex material behavior are the hyper-elasticity and visco-elasticity observed in elastomeric materials. An example of complex loading is real-world multi-block road load data used to estimate fatigue life. Sometimes simple metallic structures exhibit complex behavior due to complex loading; examples are high-speed impact and creep loading. Over time, many material testing labs have come into existence that can perform in-house testing to provide the right input data for simulation.

Conclusion: You will come out of the vicious loop of comparing physical and simulation results after a couple of iterations if you have three things in place: the right people, the right software product and the right input data. If you need help with any of the three, we are always available.

Anyone who has dealt with a Bill of Materials (BOM) knows about the challenges and complexities involved. Sometimes we get asked: managing a single BOM is cumbersome enough, so why do we even need another one in the form of a Manufacturing Bill of Materials (mBOM)?

What we have seen with our customers is that when there is only one BOM, it is usually owned by the engineering department (CAD BOM/eBOM) and is available to the manufacturing department as read-only. This is not good enough for manufacturing teams, as they need to author and add data specific to manufacturing, for example manufacturing-specific consumable parts like glue, oil or tool fixtures. Another key factor is how the BOM is structured; typically the eBOM is structured around organizational systems and functions and represents the product architecture, but the manufacturing team needs an mBOM organized according to the manufacturing assembly order.

When customers work towards the Industry 4.0 goal, they need smarter manufacturing solutions and systems that provide more ways to capture manufacturing business intelligence and then suggest solutions based on previous patterns. With this in mind, they need to invest in the mBOM authoring and management area. During an mBOM adoption, the key is not to recreate the data that is already in the eBOM, but to reuse the eBOM and add information specific to manufacturing. That way there is both reuse and traceability of the data.

At a high level, mBOM creation automation solutions exist in multiple flavors:

  1. Recipe-based mBOM: In a recipe-based mBOM, customers initiate mBOM creation via pre-configured templates pointing to the eBOM. Based on the recipe stored with the template, the engineering parts are automatically fetched into the mBOM. This kind of solution helps customers who have heavy standardization in their product offerings.
  2. Reusable Manufacturing Assembly: In such a solution, customers can leverage the same manufacturing assembly across multiple product lines to reduce design, development and procurement costs.
  3. New Offline Processing Solutions: This approach tailors the mBOM creation process and application to the customer's needs using customization, standardizing and automating the process to capture business intelligence and enable its reuse.
  4. Smarter Validations: Such solutions suggest what's next to business users, so that users spend less time discovering problems and more time solving them.

The overall value of such solutions is not just the flexibility they offer the manufacturing team; they also reduce manufacturing process planning and execution lead time, with improved structure accuracy and a significant reduction in change reconciliation processing time.

More often than not, PLM starts as a CAD/design data vault for companies, later evolving into a design data exchange platform. The most successful companies are taking PLM beyond just a design data exchange and access control platform, to a knowledge-driven decision support system. This means PLM not only needs to manage the multitude of information generated at various stages of the product lifecycle, but also capture product development knowledge and feed it back into the lifecycle. For example, the requirements and design for a newer version of a product need to be driven by the knowledge elements captured from the previous version's lifecycle, from inception to design to manufacturing and service.

When PLM stays only in the design engineering world, it is constrained in exchanging information and capturing knowledge from downstream stages managed by disconnected, siloed systems. This results in engineers spending a huge amount of time on data acquisition tasks. Industry studies show that information workers spend 30-40% of their time just on information gathering and analysis, wasting time searching for nonexistent information, failing to find existing information, validating information or recreating information that can't be found.

Quality escapes are another challenge with such disconnected systems, arising when the product doesn't conform to the engineering definition. Non-conformances found on the shop floor are costly to review and disposition, and even more severe when the product is already in service. Reconciling change is also extremely challenging, especially its downstream propagation, resulting in significant productivity losses. Slow change processing along with quality escapes causes delays in new product introduction, affecting companies' overall ability to compete.

The first step towards transforming PLM into a true knowledge-driven decision support system is to extend it to the CAD/CAM/CNC process chain, thus taking it to the shop floor. Such a solution helps establish a continuous loop from engineering into the shop floor for operations management and manufacturing execution systems (MES). Such a continuous-loop system provides more ways to capture business intelligence and then suggest solutions based on previous patterns. It then becomes much easier to capture information and use analytics to synthesize valuable knowledge elements, compared to the fragmented solutions many companies have today. It is also a foundational element for establishing a Digital Twin per the Industry 4.0 vision.

 

Other key benefits of extending PLM to manufacturing include:

Reducing the time to market

  • Enhanced collaboration between Product and Manufacturing Engineering
  • Enhanced Traceability and Faster Change Management

Enhancing Flexibility

  • Manufacturing plans comprehend product variability/complexity
  • “What if” scenarios for optimized decision making

Increasing Quality

  • Manufacturing Simulation and validation integrated in PLM
  • Up-to-date 3D work instructions delivered to the shop floor

Increasing Efficiency

  • Ongoing process optimization based on Closed loop feedback of utilization data
  • Reuse of common methods/tooling

You have probably heard about 3D EXPERIENCE. Or you may have heard about “The Platform”. But what does it actually do? How is it made? This article will explain, from the very beginning, what the 3D EXPERIENCE Platform is all about.

The Platform is created by Dassault Systemes and is their central product for the future. Here is a quote from the 3DS website:

“Dassault Systemes, the 3DEXPERIENCE Company, provides business and people with virtual universes to imagine sustainable innovations”

3D EXPERIENCE Platform is primarily a Product Lifecycle Management (PLM) system and is aimed at supporting the digital design and development of products that subsequently get manufactured. This explains the references to “virtual” and “innovations” in the quote. As a platform, it houses multiple capabilities (apps) in a single seamless piece of software. A user logs into one interface and is able to access all the installed and licensed capabilities of their 3D EXPERIENCE from there. The platform is based on a database, allowing for storage and indexing of data.

Dassault Systemes explain their platform in terms of a 3D compass illustrated below:

 

Let us look at the four quadrants, starting at the top and working around the compass in a clockwise direction:

  1. Collaboration apps include functionality that fosters informal collaboration across extended teams (SWYM) and structured collaboration such as formal change management (ENOVIA)
  2. Information intelligence apps are designed to handle Big Data and, drawing from multiple sources, present the user with concise summaries of the information they need (NETVIBES)
  3. Simulation apps are aimed at virtual digital validation and testing of designs, traditionally known as CAE or FEA (SIMULIA). They can also be extended to virtual simulation of factories or retail store layouts.
  4. 3D Modeling apps are perhaps what Dassault Systemes is best known for and include CATIA and SOLIDWORKS

Finally, because all this functionality is in a single platform, this allows the user a realistic and immediate experience in a virtual world (Real time Experience).

A few more pertinent points regarding the 3D EXPERIENCE Platform:

  1. The platform is scalable to suit the requirements of each organization using the technology. Adopters can choose to start with basic functionality and then move to more advanced capabilities.
  2. The concept of a design platform grew out of the necessity to store and maintain all the digital data generated during product development after widespread adoption of CAD applications. It now has extensive capabilities beyond that.
  3. Dassault Systemes are constantly adding new apps and functions to the platform, so the range of capability is expanding.
  4. More and more of the platform is moving to the cloud. Dassault Systemes now offer a complete SaaS model for many apps within the platform. This dramatically reduces implementation complexity.

Obviously, each application included in the platform is comprehensive enough to warrant a complete subject in and of itself, but I hope this breakdown has given you a useful high-level overview of what 3D EXPERIENCE can do for an organization, not just at initial implementation but in terms of adapting to its changing needs over time.

 

 

Are you looking at investing in a Manufacturing Execution System (MES)? Do you need to improve the efficiency of your manufacturing operations with the latest technology? If you answered yes, then an MES benchmark may be exactly what is needed.

In order to realize the value from your current or future MES investments, you must first understand the maturity of your business and your current state. In addition, you must identify a pragmatic future state and plan a roadmap to achieve it. This may involve not only introducing new technologies and processes, but also changes to your organization to support them.

Tata Technologies has developed a structured MES Analytics process with supporting tools and processes to help our customers understand the maturity of their MES, compare it to their peers and plan for the future.

The MES Benchmark assessment captures the opinions of senior and knowledgeable personnel in your organization on the current state and future MES requirements for your business, together with a priority for improvement and an assessment of current effectiveness. It centers on 17 key MES “Pillars” ranging from Scheduling Management, through to Shipping. These pillars are listed below:

  1. Enterprise Resource Planning (ERP) Integration
  2. Product Lifecycle Management (PLM) Integration
  3. SCADA, Control and Interfaces
  4. Inventory Management
  5. Planning, Scheduling and Execution
  6. Resource Management
  7. Progress Tracking
  8. Track / Traceability / Genealogy
  9. Error Proofing
  10. Quality Management
  11. Recipe Management
  12. Work Instructions
  13. Shipping Management
  14. Shop floor Information
  15. Data Collection and Performance Analysis
  16. Maintenance Planning and execution
  17. Predictive Analytics

After the 17 pillars have been covered, senior and knowledgeable personnel are also invited to “spend” an assumed benefit in value areas within your business. The areas identified are improving time to market, increasing the portfolio of the company and improving product quality.

Finally, the tool produces a comprehensive report showing the customer's current state of maturity and a benchmark comparison with the industry.

Participants have found this process to be very useful as it allows them to prioritize their initiatives, gives a high-level view of their roadmap to success and provides them with industry benchmark information.

SOLIDWORKS® MBD (Model Based Definition) is an integrated drawingless manufacturing solution for SOLIDWORKS 3D design software. With SOLIDWORKS MBD, you can communicate product and manufacturing information (PMI) directly in 3D, bypassing time-consuming 2D processes, and eliminating potential problems. Companies embracing model-based definition methodologies report savings in multiple areas, including reductions in manufacturing errors, decrease in scrap and rework costs, and lower procurement costs for purchased parts. With such an increased focus on using SOLIDWORKS MBD, at i GET IT we invested in creating a new self-paced training course to help people understand and use the features of MBD.

Our newly released SOLIDWORKS MBD and DimXpert course covers the basics of creating and communicating Product Manufacturing Information (PMI) using the SOLIDWORKS MBD and DimXpert tools. You'll learn what SOLIDWORKS Model Based Definition is, how to attach dimensions and other annotations to a model, how to view and manage annotation views, specialized PMI techniques, how to create and publish custom 3D PDF templates, and how to view 3D PDFs and eDrawings once they've been created.

After completing this course, you will receive a Certificate of Completion from i GET IT and Tata Technologies. This certificate can be used to prove that you have completed training on using SOLIDWORKS MBD. The course content consists of video lessons, practice Try It exercises and in-course quizzes.

To learn more about the new SOLIDWORKS MBD and DimXpert self-paced training course (and view our entire library), visit here.

Over 90 hours of i GET IT online training courses for SOLIDWORKS are included in the SOLIDWORKS Training, Designer and/or Professional Engineer bundles. Current subscribers will automatically receive access as long as the subscription is currently active. For more information on training plans, click here.

© Tata Technologies 2009-2015. All rights reserved.