Siemens PLM

Read articles by industry-leading experts at Tata Technologies as they present information about Siemens PLM products, training, knowledge expertise, and more. Sign up below to receive updates on posts by email.

In Active Workspace 3.4, Siemens PLM has made some significant improvements to search capabilities. Here are some of the highlights.

Numerical Range Filters

Users can now filter and narrow down search results by entering a range of values for numerical properties in the filter panel, so that they get results only within the range they have specified. For example, find bolts with a length between 60 and 100 mm. They can also use open-ended ranges by leaving the lower or upper bound blank. This is supported for both classification and object properties, in both global and in-context search, using integers and real numbers.
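As an illustration of the filtering behavior described above (a sketch, not Active Workspace code), a numeric range filter with optional bounds can be modeled in Python, where a blank lower or upper bound corresponds to `None`; the bolt names and lengths are hypothetical:

```python
from typing import Optional

def in_range(value: float, lower: Optional[float] = None,
             upper: Optional[float] = None) -> bool:
    """Return True if value lies within [lower, upper].
    A bound left as None is open-ended, mirroring the blank
    lower/upper bound behavior described above."""
    if lower is not None and value < lower:
        return False
    if upper is not None and value > upper:
        return False
    return True

# Hypothetical bolt lengths in mm (illustrative data only)
bolts = {"B-01": 55.0, "B-02": 72.5, "B-03": 100.0, "B-04": 130.0}

# Bolts with length between 60 and 100 mm (bounds inclusive here)
matches = [name for name, length in bolts.items() if in_range(length, 60, 100)]
print(matches)  # ['B-02', 'B-03']
```

Leaving `upper=None` would match everything at or above 60 mm, which is the open-ended case mentioned above.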

Pre-filter for Add Objects 

Active Workspace 3.4 allows a property-based pre-filter to be applied to in-context search. This provides better control over the allowable choices when adding related objects, with the ability to retrieve context-sensitive search results via configuration. The configuration sets a “query type” pre-filter in the XRT definition of the “Add” command, which can be based on any property value. Users can always widen the scope by deactivating the filter.

Search for Business Objects based on Form Properties

Users can now search and filter on properties of Master Forms and other forms attached to an item revision without using compound properties. Master forms are supported out of the box; other forms (including custom forms) require adding a reference to the form storage class. Form properties can be configured to display as Form Name.Property or as Property only, and this display is used in both the filter panel and the search string syntax. This capability can also be used in conjunction with dynamic compound properties (DCP) to avoid schema changes when enabling search and filtering on properties of related forms.

With every release of Active Workspace, Siemens PLM keeps adding enhanced capabilities for change management process execution, and Active Workspace 3.4 is no exception.

The first enhancement is a simplified overview of a change that includes the most relevant information pertaining to it. The consolidated change overview now includes the change description, details, participants, and originating changes. A new visual status bar shows the change’s progress through the overall maturity process, helping users quickly determine change maturity. There is also an easily accessible change summary that shows adds, removes, and replaces. Impacted/solution items added or lineage set via the Active Workspace UI, as well as BOM changes made in the rich client from Structure Manager’s supersedure window, are reflected in the new change summary. These easy-to-interpret change details help users understand the full impact of a change before making decisions on it.

The relationship browser in Active Workspace 3.4 has been improved to show all associated change objects and their relationships interactively. This helps users easily understand change objects and their hierarchical relations (Implements, Implemented By, dependencies), find relations between change objects and other business objects (Problem Items, Impacted Items, Solution Items), and see relations between items (Lineage).

These new capabilities make Active Workspace an even more preferred user interface for change management adoption.

With every release of Active Workspace, Siemens PLM keeps adding enhanced capabilities for Schedule Manager process execution, and Active Workspace 3.4 is no exception.

The most exciting Schedule Manager improvement in Active Workspace 3.4 is the ability to perform a “what-if” analysis. What-if analysis mode enables project managers to experiment on a live schedule without impacting it. This is like working with the schedule in a “sandbox” environment, making changes to tasks without committing them to the production database. It helps project managers determine how various schedule component changes may affect the outcome of the schedule before actually committing them. Once they are satisfied with the changes, they can promote and commit them to the schedule; if they are not satisfied with the outcome, they can discard the analysis.

There are also enhancements that make the Schedule Manager tool easier to use. Users can now change the Gantt timescale using the zoom in/out feature. They can add and remove schedule deliverables, assign multiple schedule tasks to one team member using multi-select mode, add multiple tasks quickly by pinning the ‘Add schedule task’ panel, and manually launch a workflow on a task. These advances in schedule authoring give project managers and coordinators greater ease and flexibility in schedule definition and maintenance.

In this age of disruption, product development companies need to reach across the evolving business ecosystem at a more rapid pace than ever while protecting their intellectual property. Teamcenter Active Workspace enables product development companies to:

  1. Reach more people by connecting more people in more places with product data and processes, both internal and external. With a lightweight, web-based user interface, they can harness the power of a changing workforce to get ahead of the competition.
  2. Reach across the business processes and leverage disruption to be a market leader. Using new tools for configuring Teamcenter, they can easily adapt to change now, and in the future.
  3. Reach greater returns by finding new ways to support the business. By reducing the burden of software support and maintenance, companies can focus on driving revenue.

User experience is the key factor when it comes to reaching more people in more places. The user experience focus for Active Workspace is to provide a clean, efficient, and simple user interface that works across multiple devices and use cases, from the basic to the more sophisticated. The Active Workspace UI guiding principles include:

  • Simple – A clean, efficient, and responsive layout that works in various form factors and conditions
  • Engaging – Embedded dashboard views and big picture reporting
  • Effective – Easy data and relationship visualization and creation
  • Active – Search results configured by relevance of best match, allowing fast and efficient refining of the results

Active Workspace is focused on delivering content for the use cases people need to execute, making it easy to create, find, relate, and work with data. This focus makes the experience more engaging and active for users, as opposed to a static, unintelligent user interface. Throughout the interface, companies can personalize the user experience to minimize training and encourage participation from key stakeholders throughout the enterprise.

The key business drivers for Active Workspace are:

  • User productivity – Today’s web users expect that no training should be needed: applications on the web should be easy to learn and simple to use. The Active Workspace user experience design is simple enough for occasional use, yet productive and powerful enough for complex business problems, making it consistent and efficient for all users and processes.
  • Reducing information overload – This is key to making smarter and faster decisions. Users should only see relevant and necessary information in the context of what they are doing. The UI actively guides users to what needs attention, automates the mundane, and provides contextually integrated tools.
  • Reducing Cost of Ownership – Active Workspace can be easily configured, extended, and deployed with lower cost of ownership.

Many leading manufacturers pursue a global product development and manufacturing strategy. Although this allows manufacturers to achieve tremendous economies of scale and scope, it has increased planning and collaboration complexity by orders of magnitude, especially in the following areas.

  • Planning global production
  • Optimizing and effectively leveraging capacity
  • Answering manufacturing feasibility questions with confidence
  • Mitigating scrap, rework and delays

When product design and manufacturing are dispersed on a global scale, how do manufacturers ensure that their teams can collaborate and perform analysis in a secure environment? Often there is a vacuum between product/process design and actual manufacturing execution: the two teams don’t have suitable tools to share and exchange information.

At the design stage, engineers deal with design data, CAE models, embedded software designs, and so on. At the execution stage, ERP and MES systems are responsible for managing job orders, inventory, scheduling, and more. Manufacturing process management solutions allow manufacturers to manage their enterprise product and production data on a global scale and to integrate product design and production execution processes in a single platform. Teamcenter Manufacturing Solutions provides an end-to-end solution to collaboratively design, validate, optimize, and document manufacturing processes. Key capabilities include:

  • Process design and planning

A single source of manufacturing knowledge can streamline collaborative processes and decision making across the product and manufacturing engineering departments. Teamcenter supports process design and planning by leveraging all of the product and process information for planning purposes, creating multiple plant views with process structure, scoping the process workflow and tracking BOM line items. This can reduce planning cycles and optimize production.

  • Change visibility

Teamcenter provides visibility to change. Late-stage changes can have the largest impact on a manufacturer’s bottom line, as the cost of change rises exponentially throughout the product lifecycle. Teamcenter communicates change from engineering to production in controlled workflows that include bill of materials management, provides production updates, and validates the impact to existing production processes.

  • Manufacturing work instructions

With Teamcenter, electronic work instructions are created and managed in a single source that spans the lifecycle, from design to manufacturing planning to process instruction planning to execution. You can streamline workflows and work instruction processes, including 3D visualization and simulation, to provide product context and demonstrate how to execute tasks.

  • Interoperability and open architecture

Underpinning the entire manufacturing process is Teamcenter’s open PLM platform. Teamcenter brings together all engineering and manufacturing information, including bi-directional BOM-BOP integration. By using ISO-standard JT files, manufacturing workers have visibility to 3D product designs in a CAD-neutral visualization format.

In summary, the benefits of using Teamcenter for manufacturing process management are:

  • Concurrently develop product and process plans so you can make smarter decisions, earlier, and speed time to market.
  • Mitigate the risk of late-stage change, which has the largest single impact to profitability
  • Reuse proven global production capabilities to optimize quality and performance
  • Leverage Teamcenter PLM investment to streamline manufacturing planning and operations, as well as engineering

The Product Excellence Program helps Siemens PLM Software understand how customers use its products and assists in improving the software in future releases. The program is designed to protect the privacy of the user and the intellectual property created through the use of Siemens PLM Software products. It collects data about Siemens PLM Active Workspace product usage and the associated Teamcenter platform installation. Data collection occurs in the background as the software is used and does not affect performance or functionality; collected data is sent to Siemens PLM Software for analysis. Per Siemens PLM, no contact information is contained in the data collected, nor is any information about the data users create or manage. Data is solely for use by Siemens PLM Software and is never shared with third parties.

Participation in the Product Excellence Program is enabled by default during installation using either TEM or Deployment Center. System administrators can always opt out during install. Post-install, participation can be controlled with the TC_ProductExcellenceProgram site preference. All data collection is anonymous and covers product usage: the Teamcenter server platform (version, platform, architecture), the client environment (browser type, version), and client page visits; collected data is sent from the client browser.

This topic has always been popular, and this problem has always been complicated, in the FEA user community since the inception of Abaqus, or of any nonlinear FEA code in general. In this brief article, I will highlight a few simulation situations where Abaqus Standard may not be a good candidate from a convergence perspective. Identifying these situations early during pre-processing and working in Explicit right away may save a lot of the time and effort that would otherwise be wasted trying Abaqus Standard.

  • Look at the motion aspect: We always say that simulation is not a complete replacement of physical testing right away. In the beginning, physical tests play a critical role in identifying the right approach for simulation as well as in correlating data between physical and virtual tests. Look closely at the physical test: is there large relative motion between the parts involved? If yes, then Standard is very likely to face convergence problems, even if the problem is static by nature. Standard has options for “small sliding” and “finite sliding,” but users should remember the difference between “finite sliding” and “large sliding.” Attached is a video of a wire crimping simulation that is ideally a static problem but numerically not a good candidate for Standard, primarily because of motion.
  • Clock time matters: Apart from the magnitude of motion, the duration of motion matters as well. While looking at the physical test, note the time in which the motion is completed. If too much motion occurs in too little time, the problem is in fact dynamic rather than static, since inertia effects cannot be ignored. In such a situation, either Standard dynamics or Explicit would be the right way to go; which one to choose depends on the event duration. If significant dynamic phenomena happen on the order of milliseconds or microseconds, Explicit is the only option.
  • Is there severe discontinuous contact: In the status file of Abaqus Standard, there is an undesirable column called SDIs, for severe discontinuous iterations, and too many of these often leads to a convergence nightmare. The cause of SDIs is discontinuous contact, also known as “chattering”: a phenomenon in which nodes between two bodies in contact continuously change their contact status from OPEN to CLOSED from one iteration to the next as the analysis proceeds. If chattering occurs due to modeling errors, it can be corrected, but at times discontinuous contact is the nature of the problem itself. In such a situation, Explicit is the only approach to take, even for events of long physical duration. The attached video is an example of a dynamic event that would only solve in Explicit or multibody dynamics, primarily because of severe discontinuous contact.
  • Is there too much plasticity: Abaqus has material models to capture plasticity, but there is a limit on the magnitude of plasticity Abaqus Standard can handle. If the permanent deformation becomes so high that the underlying part completely loses its load-carrying capacity, the Newton-Raphson method of Abaqus Standard will not be able to establish equilibrium, leading to non-convergence. Ideally there is no further need to perform simulation, as this is a classic situation of part failure, but if further simulation is needed, it should be continued in Explicit using restart options.
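The four checks above can be condensed into a rough triage heuristic. The sketch below is illustrative only: the function name, the boolean inputs, and the millisecond threshold are my own assumptions for demonstration, not Abaqus guidance.

```python
def suggest_solver(large_relative_motion: bool,
                   event_duration_s: float,
                   severe_discontinuous_contact: bool,
                   loses_load_capacity: bool) -> str:
    """Rough pre-processing triage based on the four checks above.
    The 1e-3 s cutoff for 'millisecond-scale events' is an
    illustrative assumption, not a solver rule."""
    if severe_discontinuous_contact or loses_load_capacity:
        # Chattering or total loss of load-carrying capacity:
        # Standard's Newton-Raphson iterations are unlikely to converge.
        return "Explicit"
    if event_duration_s < 1e-3:  # millisecond/microsecond dynamic events
        return "Explicit"
    if large_relative_motion:
        # Static by nature but numerically hostile to Standard.
        return "Explicit"
    return "Standard"

print(suggest_solver(False, 10.0, False, False))  # Standard
print(suggest_solver(True, 10.0, False, False))   # Explicit
```

In practice the decision is rarely this binary, but encoding the checks this way makes it easy to see which single criterion pushed a model toward Explicit.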

My last blog focused on the need for a manufacturing BOM (mBOM). When organizations start to embrace the value of the mBOM and decide to invest in solutions to manage it, the first question is where to master it: PLM or ERP?

The answer to that question varies depending on the maturity level of PLM and ERP adoption and penetration in the organization. If both PLM and ERP are at the same or a similar maturity level, then there are many good reasons to author and manage the mBOM in the PLM system and to make ERP a consumer of the mBOM mastered in PLM.

First, in PLM the mBOM is integrated with the eBOM and the design process. eBOM integration and reuse enables front loading, helps the manufacturing team lower the cost of mBOM authoring and management, and shortens time to market. Manufacturing users can also leverage 3D visualization data in the mBOM for better decisions and better quality. With the master model approach being adopted by leading organizations, there is a lot of Product Manufacturing Information (PMI) on the 3D master model that can be leveraged in both the mBOM and downstream process planning. The mBOM can also act as the starting point for detailed process planning to create the Bill of Process (BOP) inside PLM. The BOP, or routing, can also leverage 3D visualization data to produce visual work instructions, which always remain updated with upstream design changes. The process plans can also be simulated and validated (feasibility, human ergonomics, collision, etc.) before actual execution. The validated routing then gets sent to the Manufacturing Execution System (MES) along with the visual work instructions. That way there is full traceability from CAD to eBOM to mBOM to BOP and eventually to MES.

The traceability enables users to run where used queries among all products and plants during a change process. This ensures all product changes are evaluated for impacts in both engineering and manufacturing contexts.
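To make the where-used idea concrete, here is a minimal sketch assuming a toy data model (the product, plant, and part identifiers are hypothetical, and real PLM systems resolve this through the database, not in-memory dictionaries) in which each eBOM and mBOM maps a context to the part numbers it consumes:

```python
# Toy data model (hypothetical identifiers): each BOM maps a
# product or plant/product context to the part numbers it consumes.
eboms = {"ProductA": ["P-100", "P-200"], "ProductB": ["P-100", "P-300"]}
mboms = {"Plant1/ProductA": ["P-100", "P-200"], "Plant2/ProductB": ["P-100", "P-300"]}

def where_used(part: str) -> dict:
    """Return every engineering and manufacturing context that uses
    `part`, so a proposed change can be impact-assessed in both domains."""
    return {
        "engineering": [ctx for ctx, parts in eboms.items() if part in parts],
        "manufacturing": [ctx for ctx, parts in mboms.items() if part in parts],
    }

print(where_used("P-100"))
# {'engineering': ['ProductA', 'ProductB'],
#  'manufacturing': ['Plant1/ProductA', 'Plant2/ProductB']}
```

A change to P-100 would surface impacts in two products and two plants at once, which is the cross-domain visibility the traceability chain provides.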

 

Embracing a true PLM platform and solution is not an easy endeavor for many companies, even when they recognize the potential value and ROI offered by a well-architected PLM solution. Success in any enterprise software implementation like PLM requires careful planning, dedicated resources, the right technical expertise, executive sponsorship, and a receptive culture, among other things. When done the right way, the results of such efforts are transformational, producing significant business benefits that can be measured and validated.

One of the biggest challenges to adopting PLM is organizational change management, given the breadth and scale of a true PLM solution. Many companies approach it in phases, and rightly so; the key is how the phases are architected, tracked, and measured. PLM involves managing and linking data, processes, and people together as the product goes through its lifecycle from inception to design to manufacturing to support and, eventually, end of life. The first step is often managing data, specifically engineering CAD data. Most solutions start with a way to vault the CAD data along with some basic part numbering schemes and revision rules. Sometimes engineering documents are also vaulted along with the CAD data. Yes, data vaulted in a central repository brings a lot of benefits, such as elimination of duplicates, basic check-in/check-out and access controls, and added search capabilities, as opposed to data scattered across multiple locations. But the measured value of this alone may not substantiate the heavy PLM IT investment companies need to make for a truly scalable PLM platform. Sometimes there is an expectation misalignment between the full PLM value and just the data vaulting value; this at times sends companies into a long lull of “PLM assessment” after data vaulting. Sometimes cultural resistance or organizational change overturns any momentum. Maybe a technical glitch or integration shortfall previously overlooked becomes a deal breaker. An over-scoped and under-supported initiative can also run out of money or time.

Companies make a considerable upfront IT investment in the PLM platform so that they have a scalable solution for all phases, not just CAD vaulting. Most of the time they can add more capabilities and processes on the PLM platform without additional IT investment. So it is very important to get past the initial data vaulting phase and move on to the next phases to maximize the utilization of existing IT investments. The question, then, is where to go after CAD vaulting. This is where upfront PLM roadmap definition is so important in terms of how the phases are architected, tracked, and measured. For companies that have successfully completed data vaulting but do not yet have a formal PLM roadmap defined, some of the next focus areas to consider are engineering process management, BOM management, change management, requirements management, and project and program management, in no specific order.

Organizations invest huge sums of money in simulation software to avoid expensive and disruptive physical testing processes. But how long does it really take to make this transformation happen? One thing is sure: it does not happen in a day. The flow chart below explains the reason pictorially. The last two blocks, “compare and improve model” and “compare and improve theory,” make this transformation a longer process than expected.

[Flow chart: physical test → simulation → compare results → improve model / improve theory]

Let’s explore the reasons behind this. Comparison is needed to make sure that simulation results mimic the physical testing results before the latter can be discarded, partially or fully. The difference in results can be due to three main factors: lack of user competency, limitations of the software used, and lack of sufficient input data.
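The “compare” step of the loop can be thought of as a simple tolerance check. The sketch below assumes a scalar result (for example, a peak stress) and an arbitrary 5% relative tolerance; both the quantity and the tolerance are illustrative choices, not an industry standard:

```python
def within_tolerance(physical: float, simulated: float, tol: float = 0.05) -> bool:
    """Accept the model when the simulated result agrees with the
    physical test within a relative tolerance (the 5% default is an
    arbitrary example)."""
    return abs(simulated - physical) <= tol * abs(physical)

# e.g. peak stress: 312 MPa measured vs. 298 MPa simulated -> ~4.5% off
print(within_tolerance(312.0, 298.0))  # True
print(within_tolerance(312.0, 250.0))  # False
```

Until this check passes, the loop sends the analyst back to improve the model or the underlying theory, which is why the transformation takes longer than expected.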

Lack of user competency: FEA analysts are not made in a day. The subject is complex to learn, and so is the software associated with it. The ramp-up time depends on the analyst’s background along with the complexity of the problem being simulated. Organizations usually make a choice between hiring expert (and expensive) analysts who can deliver results right away or producing analysts of their own through classroom and hands-on training. The first option saves time; the second saves money. CAE software companies are also making big strides these days by introducing CAD-embedded simulation tools that require only nominal user competency. Nevertheless, competency builds up over time.

Limitations of the software used: The initial investment in the simulation domain is usually small, which means one of two things: either the number of users is small or the software functionality is limited. With time, the complexity of problems goes up, but the software remains the same. A common example I have seen is a customer starting with the simple linear simulation workbench in CATIA and, over time, trying to simulate finite sliding contact problems with frictional interfaces in the same workbench. Users don’t realize that their problem complexity has exceeded the software’s capacity and that it’s time to upgrade. It is always recommended that analysts get in touch with their software vendor whenever they anticipate a need for increased simulation software capacity or functionality. A certified simulation software vendor is a trusted advisor who can really help.

Lack of sufficient input data: “Garbage in, garbage out” is a very common phrase in the simulation world. However, at times it is very difficult to get the right input for the software in use. The complexity of input data can arise either from complex material behavior or from complex loading conditions. Examples of complex materials include the hyper-elasticity or visco-elasticity observed in elastomeric materials. Examples of complex loading include real-time, multi-block road load data used to estimate fatigue life. Sometimes simple metallic structures exhibit complex behavior due to complex loading; examples are high-speed impact and creep loading. Over time, many material testing labs have come into existence that can perform in-house testing to provide the right input data for simulation.

Conclusion: You will come out of the vicious loop of comparing physical and simulation results after a couple of iterations if you have three things in place: the right people, the right software product, and the right input data. If you need help with any of the three, we are always available.

© Tata Technologies 2009-2015. All rights reserved.