Category: Siemens PLM

Embracing a true PLM platform and solution is not an easy endeavor for many companies, even when they recognize the potential value and ROI offered by a well-architected PLM solution. Success in any enterprise software implementation like PLM requires careful planning, dedicated resources, the right technical expertise, executive sponsorship, and a receptive culture, among other things. When done right, the results of such efforts are transformational, producing significant business benefits that can be measured and validated.

One of the biggest challenges in adopting PLM is organizational change management, given the breadth and scale of a true PLM solution. Many companies approach it in phases, and rightly so; the key is how the phases are architected, tracked, and measured. PLM involves managing and linking data, processes, and people together as the product goes through its lifecycle, from inception to design to manufacturing to support and eventually end of life. The first step is often managing data, specifically engineering CAD data. Most solutions start with a way to vault the CAD data along with some basic part numbering schemes and revision rules. Sometimes engineering documents are also vaulted along with the CAD data. Vaulting data in a central repository does bring many benefits, such as elimination of duplicates, basic check-in/check-out and access controls, and better search capabilities than data scattered across multiple locations. But the measured value of this alone may not justify the heavy PLM IT investment companies need to make for a truly scalable PLM platform.

Sometimes there is an expectation misalignment between the full PLM value and the data vaulting value alone. This can send companies into a long, stagnant "PLM assessment" period after data vaulting. Sometimes cultural resistance or organizational change overturns any momentum. Maybe a technical glitch or integration shortfall that was previously overlooked becomes a deal breaker. An over-scoped and under-supported initiative can also run out of money or time.

Companies make a considerable upfront IT investment in the PLM platform so that they have a scalable solution for all phases, not just CAD vaulting. Most of the time they can add more capabilities and processes on the PLM platform without additional IT investment. So it is very important to get past the initial data vaulting phase and move on to the next phases to maximize the utilization of the existing IT investment. The question then becomes: where do we go after CAD vaulting? This is where an upfront PLM roadmap definition is so important, in terms of how the phases are architected, tracked, and measured. For companies that have successfully completed data vaulting but do not yet have a formal PLM roadmap, some of the next focus areas to consider are engineering process management, BOM management, change management, requirements management, and project and program management, in no specific order.

Organizations invest huge sums of money in simulation software to avoid expensive and disruptive physical testing. But how long does it really take to make this transformation happen? One thing is sure: it does not happen in a day. The flow chart below explains the reason pictorially. The last two blocks, "compare and improve model" and "compare and improve theory," make this transformation a longer process than expected.

 

Let's explore the reasons behind it. Comparison is needed to make sure that simulation results mimic the physical test results before the latter can be discarded, partially or fully. The difference in results can be due to three main factors: lack of user competency, limitations of the software used, and lack of sufficient input data.
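Before going through these factors, the comparison loop itself can be sketched in a few lines. This is purely illustrative: the numbers and the "improve the model" step are placeholders, not any specific correlation method.

def validate(sim_result: float, test_result: float, tolerance: float = 0.05) -> bool:
    """True when the simulation is within a relative tolerance of the physical test."""
    return abs(sim_result - test_result) / abs(test_result) <= tolerance

test_result = 100.0   # measured quantity from the physical test
sim_result = 82.0     # initial simulation prediction
iterations = 0
while not validate(sim_result, test_result):
    # "Compare and improve model": refine mesh, material data, boundary conditions, ...
    sim_result += 0.5 * (test_result - sim_result)   # stand-in for a model improvement
    iterations += 1
print(f"Correlated after {iterations} model updates: {sim_result:.1f} vs {test_result:.1f}")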

Lack of user competency: FEA analysts are not made in a day. The subject is complex to learn, and so is the software associated with it. The ramp-up time depends on the analyst's background and the complexity of the problem being simulated. Organizations usually choose between hiring expert (and expensive) analysts who can deliver results right away, or developing analysts of their own through classroom and hands-on training. The first option saves time while the second saves money. CAE software vendors are also making headlines these days by introducing CAD-embedded simulation tools that require only nominal user competency. Nevertheless, competency builds up over time.

Limitation of the software used: The initial investment in the simulation domain is usually small, which typically means either a small number of users or limited software functionality. Over time, the complexity of the problems goes up, but the software remains the same. A common example I have seen is a customer starting with the simple linear simulation workbench in CATIA and, over time, trying to simulate finite-sliding contact problems with frictional interfaces in the same workbench. Users don't realize that their problem complexity has exceeded what the software can handle and that it's time to upgrade. It is always recommended that analysts get in touch with their software vendor whenever they anticipate needing more simulation capacity or functionality. A certified simulation software vendor is a trusted advisor who can really help.

Lack of sufficient input data: "Garbage in, garbage out" is a very common phrase in the simulation world. However, at times it is very difficult to get the right input for the software in use. The complexity of input data can arise either from complex material behavior or from complex loading conditions. Examples of complex material behavior are the hyperelasticity or viscoelasticity observed in elastomeric materials. An example of complex loading is real-time, multi-block road-load data used to estimate fatigue life. Sometimes simple metallic structures exhibit complex behavior due to complex loading; examples are high-speed impact or creep loading. Over time, many material testing labs have emerged that can perform in-house testing to provide the right input data for simulation.
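To make this concrete, consider a common hyperelastic form, the Neo-Hookean strain energy W = C10(I1 − 3) + (1/D1)(J − 1)². The coefficients C10 (related to the shear modulus) and D1 (related to compressibility) cannot be guessed; they have to be fit from uniaxial, biaxial, or planar test data measured on the actual elastomer compound, which is exactly the kind of input data such labs provide.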

Conclusion: You will break out of the cycle of comparing physical and simulation results after a couple of iterations if you have three things in place: the right people, the right software product, and the right input data. If you need help with any of the three, we are always available.

Anyone who has dealt with a Bill of Materials (BOM) knows the challenges and complexities involved. Sometimes we get asked: if managing a single BOM is already cumbersome, why do we even need another one in the form of a Manufacturing Bill of Materials (mBOM)?

What we have seen with our customers is that when there is only one BOM, it is usually owned by the engineering department (the CAD BOM or eBOM) and is available to the manufacturing department as read-only. This is not good enough for manufacturing teams, as they need to author and add data specific to manufacturing, for example manufacturing-specific consumables such as glue, oil, or tool fixtures. Another key factor is how the BOM is structured: the eBOM is typically structured around systems and functions and represents the product architecture, whereas an mBOM needs to be organized according to the manufacturing assembly order.

As customers work toward the Industry 4.0 goal, they need smarter manufacturing solutions and systems that provide more ways to capture manufacturing business intelligence and then suggest solutions based on previous patterns. With this in mind, they need to invest in manufacturing BOM authoring and management. During mBOM adoption, the key is not to recreate the data that is already in the eBOM, but to reuse the eBOM and add information specific to manufacturing. That way there is both reuse and traceability of the data.
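As a simplified illustration of that reuse idea (the part numbers, stations, and classes below are hypothetical and not tied to any particular PLM system), an mBOM can reference the same eBOM part records while regrouping them by assembly station and adding manufacturing-only consumables:

from dataclasses import dataclass

@dataclass
class Part:
    number: str
    description: str

@dataclass
class MBomLine:
    part: Part              # reused eBOM part, traceable by reference
    station: str            # manufacturing assembly order / station
    source: str = "eBOM"    # "eBOM" for reused parts, "mBOM" for added consumables

ebom = [Part("P-100", "Housing"), Part("P-200", "Gear set"), Part("P-300", "Cover")]

# Regroup the same parts by assembly order instead of product architecture
routing = {"P-100": "Station 10", "P-200": "Station 20", "P-300": "Station 30"}
mbom = [MBomLine(p, routing[p.number]) for p in ebom]

# Add a manufacturing-specific consumable that never appears in the eBOM
mbom.append(MBomLine(Part("C-010", "Assembly grease"), "Station 20", source="mBOM"))

for line in mbom:
    print(f"{line.station}: {line.part.number} ({line.source})")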

At a high level, mBOM creation automation solutions exist in several flavors:

  1. Recipe-based mBOM: Customers initiate mBOM creation via pre-configured templates pointing to the eBOM. Based on the recipe stored with the template, the engineering parts are automatically fetched into the mBOM. This kind of solution helps customers who have heavily standardized product offerings.
  2. Reusable manufacturing assembly: Customers can leverage the same manufacturing assembly across multiple product lines to reduce design, development, and procurement costs.
  3. New offline processing solutions: This approach tailors the mBOM creation process and application to the customer's needs through customization, standardizing and automating the process so that business intelligence is captured and reused.
  4. Smarter validations: Such solutions suggest the next step to business users, so that users spend less time discovering problems and more time solving them.

The overall value of such solutions is not just the flexibility they offer the manufacturing team; they also reduce manufacturing process planning and execution lead time, with improved structure accuracy and a significant reduction in change reconciliation processing time.

For many companies, PLM starts as a CAD/design data vault and later evolves into a design data exchange platform. The most successful companies are taking PLM beyond a design data exchange and access control platform to a knowledge-driven decision support system. This means PLM not only needs to manage the multitude of information generated at various stages of the product lifecycle, but also to capture product development knowledge and feed it back into the lifecycle. For example, the requirements and design for a newer version of a product should also be driven by the knowledge elements captured from the previous version's lifecycle, from inception to design to manufacturing and service.

When PLM stays only in the design engineering world, it is constrained in exchanging information with, and capturing knowledge from, downstream stages managed by disconnected, siloed systems. This results in engineers spending a huge amount of time on data acquisition tasks. Industry studies show that information workers spend 30-40% of their time just on information gathering and analysis: searching for nonexistent information, failing to find existing information, validating information, or recreating information that cannot be found.

Quality escapes are another challenge with such disconnected systems, occurring when the product does not conform to the engineering definition. Non-conformances found on the shop floor are costly to review and disposition, and even more severe when the product is already in service. Reconciling change is also extremely challenging, especially its downstream propagation, resulting in significant productivity losses. Slow change processing, along with quality escapes, causes delays in new product introduction, affecting companies' overall ability to compete.

The first step toward transforming PLM into a true knowledge-driven decision support system is to extend it to the CAD/CAM/CNC process chain, taking it onto the shop floor. Such a solution helps establish a continuous loop from engineering to the shop floor, covering operations management and manufacturing execution systems (MES). A continuous-loop system provides more ways to capture business intelligence and then suggest solutions based on previous patterns. It then becomes much easier to capture information and use analytics to synthesize valuable knowledge elements compared with the fragmented solutions many companies have today. It is also a foundational element for establishing a Digital Twin per the Industry 4.0 vision.

 

Other key benefits of extending PLM to manufacturing include

Reducing the time to market

  • Enhanced collaboration between Product and Manufacturing Engineering
  • Enhanced Traceability and Faster Change Management

Enhancing Flexibility

  • Manufacturing plans comprehend product variability/complexity
  • “What if” scenarios for optimized decision making

Increasing Quality

  • Manufacturing Simulation and validation integrated in PLM
  • Up-to-date 3D work instructions delivered to the shop floor

Increasing Efficiency

  • Ongoing process optimization based on Closed loop feedback of utilization data
  • Reuse of common methods/tooling

“What you buy makes a difference but from whom you buy makes a bigger difference”

Most often, I talk about the greatness of our product offerings in my blog articles. Such blogs help prospective customers choose the right product. But the same product can be procured in multiple ways: either directly from the developer or through a value-added reseller, also called a VAR. In this blog article, I will focus on how a prospective customer should select the right VAR when purchasing a Dassault Systemes or Siemens simulation product.

The first thing a customer needs to verify is whether the VAR is supplying just the product or a complete solution. The difference between the two is the "value-added services" associated with product usage.

Without value-added services, it is not possible for a reseller to become a value-added reseller. Please identify whether you are doing business with just a reseller or a value-added reseller. Remember, simulation tools are not easy to use. There is a learning curve associated with these tools that can greatly impact the ROI and break-even timeline. A user's productivity can be substantially enhanced by working with a reseller who provides a whole range of services to shorten the learning curve and reach break-even faster. Now let's look at what types of services make a difference in the simulation space.

We are talking about software sales as well as consulting, training, and support. Our software partners Dassault Systemes, Siemens, and Autodesk offer a number of certifications around these four components to distinguish mere "resellers" from "value-added resellers." Being certified means the reseller has enough resources and knowledge to execute a given sales or service task. Let's talk about each component with respect to simulation:

Software: To sell any DS SIMULIA product, the associated VAR should have the "SIMULIA V6 DesignSight" certification as a minimum. Further brand certifications are available, such as Mid-Market Articulate for product highlights and Mid-Market Demonstrate for product technical demonstrations. To sell the FEMAP product from Siemens, the VAR must have the "FEMAP technical certification" as a minimum. All these certifications involve timed examinations.

Training: Training should be an integral part of simulation software sales. It gives users enough knowledge to use the software product in a production environment. To offer technical training on any SIMULIA product, the VAR should have the "finite element analysis with Abaqus specialist" certification as a minimum.

Support: Once users are in a production environment, technical support is required on a continuous basis. While many answers related to product usage are in the documentation, it is not a complete source of information. Many queries are model-specific and require the attention of a dedicated support engineer. To offer technical support on any SIMULIA product, the VAR should have at least one engineer who holds the "SIMULIA technical support specialist" certification. This certification must be renewed every two years and involves a lengthy, hard-to-pass support certification examination covering all products of the SIMULIA brand.

Consulting: Consulting services play a big role when a customer does not have enough time or resources to execute projects in house despite owning the software product, which typically happens during bursts of demand. While there are no certification criteria for VARs related to consulting in the simulation space, a dedicated consulting and delivery team is needed to offer the service when demand arises.

The above information should help you in ranking your VAR. Do you need to know our rank? Please contact us.

 

Multi-select for Project Security

Active Workspace 3.3 provides the capability to apply project security to multiple objects simultaneously in Teamcenter versions 10.1.7 and 11.2.3 and higher.

This video showcases the new capability in detail.  Click here

Highlights include

  • Assign multiple objects to one or more projects
  • Remove multiple objects from one or more projects
  • Remove objects from projects that are common to all selections
  • Honor project membership and access while making assignments

Structure support 

The second new feature is the ability to assign the content of a structure to a project.  While viewing structure content, users can assign the elements in the structure to projects.  Users can multi-select elements to assign them, or use the option to assign all of the content in a structure, or only down to a certain level.  Users can also assign the specifically referenced revisions, or all revisions, so that as the structure content is revised, it remains assigned to the project by default.

This video showcases the new capability in detail.  Click here

Highlights include

  • Assign projects to content while working within the context of a structure
  • Assign projects to entire structure or up to a specific level of the structure
  • Optionally apply project security to the revision or all revisions
  • Multi-select to assign projects

Effectivity Authoring

With Active Workspace 3.3, users can assign existing effectivity criteria to elements of the structure to indicate when those elements are applicable.  Users can also define new effectivity criteria using dates or units; for example, an element may be effective for a given date range or for a given range of production units.

Users can also name the effectivity ranges to enable sharing and reuse of the same range when applying effectivity to other elements in the structure.
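Conceptually, a named effectivity range is just a reusable applicability check. The sketch below is a generic illustration only (the names and unit numbers are made up, and this is not the Teamcenter data model):

from dataclasses import dataclass

@dataclass
class UnitEffectivity:
    name: str           # named range so it can be reused on other elements
    start_unit: int
    end_unit: int       # inclusive upper bound

    def applies_to(self, unit: int) -> bool:
        return self.start_unit <= unit <= self.end_unit

block_a = UnitEffectivity("Block A", start_unit=1, end_unit=250)

# An element carrying this effectivity is configured only for units 1-250
print(block_a.applies_to(100))  # True
print(block_a.applies_to(300))  # False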

This video showcases the new capability in detail.  Click here

Highlights include

Assign existing effectivity criteria to qualify what structured content will be configured  (Teamcenter 10.1.7 and 11.2.3 and upwards)

  • Search and filter for existing effectivities to apply
  • Apply effectivity to revision status
  • Apply effectivity to occurrences in structure

Define new effectivity configuration criteria (Planned for future release, Teamcenter 11.2.3 and upwards)

  • Set units or dates to specify or edit effectivity
  • Apply specified effectivity to occurrences
  • Optionally share named effectivity to apply to other content in structure or other structures

Baseline

Another "complete the thought" capability in the area of structures is creating a baseline.  Baselines are used to capture a view of a structure at a point in time.  Siemens chose to make this work in the background, asynchronously, so that users can continue to work in the client while the server generates the baseline.  When it completes, the Active Workspace notification center alerts users that the baseline has been created.  By default, the process applies a release status of "Baseline," but that is configurable.
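As a generic illustration of this background pattern (a conceptual sketch, not Teamcenter code), the snippet below runs a long "baseline" job asynchronously while the client keeps working, then reports a notification when the job completes:

import asyncio

async def create_baseline(structure: str) -> str:
    await asyncio.sleep(2)          # stand-in for the server-side baseline work
    return f"Baseline of '{structure}' created with status 'Baseline'"

async def keep_working():
    for step in range(3):
        await asyncio.sleep(0.5)
        print(f"User keeps working in the client... (step {step + 1})")

async def main():
    baseline_task = asyncio.create_task(create_baseline("Requirement structure"))
    await keep_working()            # the user is not blocked by the baseline job
    notification = await baseline_task
    print(notification)             # notification arrives once the baseline is ready

asyncio.run(main())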

This video showcases the new capability in detail.  Click here

While the example shows a requirement structure, baselining works with any type of structure. Highlights include

  • Executes asynchronously to allow the user to continue other work
  • Notification sent on completion – click notification to open the baseline
  • Applies a release status of “Baseline” by default, but is configurable
  • Creates a precise baseline
  • Works with any structure content, e.g. parts, designs, and requirements

Show all Results from Find in Context

Lastly in the area of completing a thought is a visualization-related topic.  In previous releases of Active Workspace, the "show only results" option in the viewer worked only for results that had already been loaded into the client.  Users no longer have to scroll through all of the results to load them into the client before selecting "show only results" in the viewer.

This video showcases the new capability in detail.  Click here

 

Universal Viewer

One of the most exciting user productivity improvements in Active Workspace 3.3 is the new universal viewer.  It enables viewing and paging through multiple file attachments.  In prior releases, only one file could be viewed, and you could not easily view other file attachments.  Siemens has also added support for viewing additional types of files, including image files, text files, and HTML files.  The viewer supports markup for many of those types as well.

This video showcases the new Universal Viewer capabilities in detail: Click here

Tab Overflow Direct Access

Previous versions of Active Workspace used a carousel approach and required multiple clicks to navigate to tabs that were hidden.  The new approach allows for direct access to any of the hidden tabs. Highlights include

  • Eliminates multiple clicks to access some tabs compared with prior carousel interaction
  • Dropdown allows direct access to any of the multiple tabs that are not shown
  • Preserves the order of tabs
  • Replaces last tab with newly selected tab

This video shows how the new tab overflow access works: Click here

Command Stack for Visual Analysis

Siemens introduced command stacks in Active Workspace 3.2.  This is an example of their usage in 3.3 to improve access to the 3D viewer’s analytics capabilities.  Instead of having to navigate tabs, users can now directly access any of the features using the command stack.  Highlights include

  • Directly access measure, query, section, and volume and proximity search commands
  • Administrators can configure alternative arrangements and visibility of commands for specific roles, e.g., commands can be unstacked or hidden for specific roles

This video shows how the command stack works for the viewer’s analytics capabilities: Click here

Drag and Drop in Structured Content

Active Workspace 3.2 supported cut/copy/paste to edit structures, including working across multiple browsers and across multiple structures. Active Workspace 3.3 builds on that capability to improve user productivity by enabling drag and drop for many cases, as described below.

Edit structures efficiently using drag and drop

  • Drag and drop between structures and unstructured lists such as folders, search results, and favorites
  • Drag within one window or across multiple windows
  • Drop action is active only when the dragged object is valid to drop on the target object

Predictable results based on context

  • Drag and drop between structures to copy content
  • Drag and drop content within a structure to move
  • Drag and drop different types of objects/elements to create relations – e.g. dropping a requirement on a part creates a tracelink

This video shows how drag and drop in structures works: Click here

Some of the other improvements include

  • Icons in the object header make it easy for users to clearly understand what object is open. For objects with thumbnails, the thumbnail is displayed with the type icon overlaid
  • Newly created items show up at the top of the list so that they are immediately visible and easily accessed. The object is automatically selected in single-create mode
  • Easily paste onto a folder or into its contents: select a target folder and use the paste command from the command bar, or use the paste command on a table header to paste content directly into the table

 

Siemens PLM has introduced a lot of new functionality and improvements in the latest version, Active Workspace 3.3, with the key themes being:

  1. User Productivity Improvements
  2. Reduce Information Overload
  3. Configure, Extend, and Deploy
  4. Process Execution and Other Application and Industry Template Exposure

The user productivity improvements break down into three categories.

  1. Improved user efficiency

The first focus area for user productivity is improved user efficiency and proficiency, achieved through accelerators such as drag and drop and multi-select for bulk actions. Some key capabilities are:

  • Universal viewer
  • Tab overflow
  • Command stack for analysis
  • Copy and paste hyperlink improvements
  • Drag-and-drop structure editing
  2. Enable “Completing a thought” with a single client

The second focus area for user productivity is to enable users to complete a thought with a single client.  Users can execute complete use cases with just the Active Workspace UI, or with a native authoring application that hosts Active Workspace within it.  In the latest version, core features and capabilities are extended for targeted use cases. Some key ones are:

  • Manage Security in Single Level Projects Hierarchy – multi-select for project security
  • Achieve secure collaboration by applying project security to configured structure content
  • Effectively manage granular access to data in larger programs through hierarchical project level security
  • Assign existing effectivity criteria to qualify what structured content will be configured
  • Define new effectivity configuration criteria
  • Create a baseline of a structure to capture a view of that structure at a point in time
  • Enable showing only the results from a find in context to easily visualize them
  3. Responsive performance

The third focus area for improved user productivity is to make the client perform and respond as quickly as possible to user gestures.  In the latest version, server calls are minimized to reduce sensitivity to latency.  Things like long-running reports run in the background to free up the client and allow the user to do other work. Some key improvements are:

  • Minimize bandwidth and memory usage through virtual paging and streaming of content
  • Minimize server communications and sensitivity to high latencies
  • Efficient execution through journaling, analysis, and tuning

I will introduce the new user productivity features in detail in subsequent blogs.

 

With Teamcenter Active Workspace, Siemens PLM purposely chose to focus on supporting specific use cases and roles rather than simply duplicating every function of the Teamcenter rich client.  The initial emphasis has been to provide a zero-install client to the broader, often less frequent, users in the enterprise.  These users require a zero-install client that is easy to learn.

With every release of Active Workspace, Siemens PLM continues to broaden the use cases and roles it supports.  The graphic, from left to right, shows the use cases and roles already delivered with complete support through to those still in the works to enable richer application exposure for authoring capabilities.  Siemens has also exposed some administration capabilities in Active Workspace, such as user management and a new XRT editor, right inside the Active Workspace user interface.  Again, all with no client install.

Active Workspace User Experience
It's all about the content.  Active Workspace shifts the focus from the application to the content: the user's data is the most important thing.  The user interface (UI) is simple, clean, light, and fast. Subdued colors let the user's creation be the star of the show.

There is a simple top-down, left-to-right flow of information: who I am and what my role is comes first; what I am working on is clear and obvious. Data brings with it the right capabilities for the context: Viewer, Where Used, Attachments, History, and so on.  One need not know how to open tools; just read the tabs to see what is available. Each tab of content brings the right capability.

A part with 3D content, for example, has a viewer tab, and that tab brings the right viewing commands to work with it. The user focuses on what needs to be worked on, not on the tools to do the work. Commands and tabs are smart: they do not appear when they do not apply or have no content. This eliminates visual clutter.

Active Workspace Framework
The Active Workspace Framework enables consistency and efficiency, both for the end user and the developer. It has established patterns that control where content and features go in the UI. Common elements and modules keep the UI consistent and simplify development. Users learn interaction patterns and see them behave consistently in new areas. 

The display is data-driven: what you open to work on controls what information is presented. A jet engine has a 3D Viewer and Trace Links, but a shampoo bottle has Trade Items and Vendors. The underlying data may be technically the same, but it is always presented in terms appropriate for that industry, that data, and even that user.

Any complete FEA solution has at least three mandatory components: a pre-processor, a solver, and a post-processor. If you compare it with an automobile, the solver is the engine: it contains all the steps and solution sequences needed to solve the discretized model and can be regarded as the main power source of a CAE system. The pre-processor is a graphical user interface that allows the user to define all the inputs to the model, such as geometry, materials, loads, and boundary conditions. In our automobile analogy, the pre-processor is the ignition key, without which it is not possible to use the engine (the solver) efficiently. The post-processor is a visualization tool for drawing conclusions from the requested output, whether text or binary. A good CAE workflow is one that offers closed-loop CAD-to-CAE data transfer.

The workflow above is not closed, so there is no scope for model updates; any change in the design requires redoing all the work. This has been the traditional workflow in organizations with completely disconnected design and analysis departments. Designers send the CAD data to analysts, who perform FEA in specialized tools and submit the product's virtual performance report back to the designers. If a change is needed, the FEA is performed manually all over again. Let's look at a better workflow.

In this workflow, if the initial design does not meet the design requirements, it is updated and sent back through to the solver rather than being rebuilt in the pre-processor: all the pre-processing steps are mapped from the old design to the new design without manual intervention. This is an effort to bridge the gap between the design and analysis departments, and it has been embraced by the industry. The extent to which the gap can be bridged depends on the chosen workflow, but almost every CAE company has taken the initiative to introduce products that bridge it. Let's discuss this in the context of Dassault Systemes and Siemens.
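As a purely illustrative stand-in for such a closed-loop study (a hypothetical sizing example, not any specific CAD/CAE product), the sketch below iterates a simple cantilever design until a stress requirement is met; in a real workflow the "solver" would be an FEA run whose pre-processing is re-mapped automatically after each design change:

def run_solver(thickness_mm, width_mm=40.0, length_mm=500.0, load_n=2000.0):
    """Maximum bending stress (MPa) in a rectangular cantilever under a tip load."""
    moment = load_n * length_mm                        # N*mm
    section_modulus = width_mm * thickness_mm**2 / 6   # mm^3
    return moment / section_modulus                    # MPa

def closed_loop_study(thickness_mm, allowable_mpa=250.0, max_iterations=20):
    for _ in range(max_iterations):
        stress = run_solver(thickness_mm)
        if stress <= allowable_mpa:        # requirement met: the loop closes
            return thickness_mm, stress
        thickness_mm *= 1.1                # "update the design" and re-analyze
    raise RuntimeError("Did not converge within the allotted iterations")

thickness, stress = closed_loop_study(thickness_mm=10.0)
print(f"Converged thickness: {thickness:.1f} mm, stress: {stress:.1f} MPa")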

Dassault Systemes: After acquiring Abaqus Inc in 2005, Dassault Systemes rebranded it as SIMULIA with the objective of giving users access to simulation capabilities without requiring the steep learning curve of disparate, traditional simulation tools. They have been introducing new tools to meet this objective.

  • The first in the series was the Associative Interfaces for CATIA, Pro-E, and SolidWorks, which are plug-ins to Abaqus CAE. With these plug-ins it is possible to automatically transfer updated data from the above-mentioned CAD platforms to Abaqus CAE with a single click; all the CAE parameters in Abaqus CAE are mapped from the old design to the updated design. It is a nice way to reduce rework, but the design and simulation teams remain separate in this workflow.
  • The next initiative was SIMULIA V5, in which Abaqus was introduced into CATIA V5 as a separate workbench. This workbench includes additional toolbars to define the Abaqus model and generate the Abaqus input file from within CATIA. Add Knowledgeware, and the user has all the features needed to perform DOEs and parametric studies. This approach brings designers and analysts with CATIA experience under one roof.
  • Next, Dassault Systemes introduced SIMULIA on the 3DEXPERIENCE platform, allowing analysts to use data management, process management, and collaboration tools with Abaqus in the form of simulation apps and roles. The solution is now mature, incorporating process optimization, lightweight optimization, durability, and advanced CFD tools. By combining SIMULIA with BIOVIA, we are also talking about multi-scale simulation from the system level down to the molecular level. It is further possible to run the simulation and store the data on a public or private cloud.

Siemens PLM solutions: Siemens' traditional CAE tools include the FEMAP user interface and the NX Nastran solver. Both have been specialized tools meant primarily for analysts, with little or no connectivity to CAD. More specialized and domain-specific tools were added with the acquisitions of LMS and Mentor Graphics.

  • In 2016, Siemens introduced its new simulation solutions portfolio, called Simcenter, which includes all Siemens simulation capabilities that can be integrated with the NX environment. The popular pre-processor in the Simcenter series is NX CAE, which has bi-directional associativity with NX CAD. Though meant for specialists, NX CAE offers a closed-loop workflow between NX CAD and NX Nastran, making it easier to evaluate redesigns and perform DOEs.
  • Siemens also offers NX CAE add-on environments for Abaqus and Ansys, allowing analysts to efficiently incorporate these solvers into their NX design environment.
  • It is further possible to use Simcenter solutions with Siemens' well-known PLM solution, Teamcenter, for enterprise-wide deployment of Siemens simulation tools.

This shift in approach is not limited to Dassault Systemes and Siemens. Every organization in this space, be it Ansys, Autodesk, or Altair, is introducing such closed-loop solutions. One reason may be the recent acquisition of many CAE companies by bigger organizations such as Dassault, Siemens, and Autodesk. Nevertheless, the change has been triggered and it will continue.

 

 
