
Process of mathematical modelling, performed on a computer

This article is about computer modelling within a scientific context. For simulating a computer on a computer, see emulator. "Computer model" redirects here. For computer models of three-dimensional objects, see 3D modelling.

Process of building a computer model, and the interaction between experiment, simulation, and theory.

Computer simulation is the process of mathematical modelling, performed on a computer, which is designed to predict the behaviour of, or the outcome of, a real-world or physical system. Since they allow checking the reliability of chosen mathematical models, computer simulations have become a useful tool for the mathematical modelling of many natural systems in physics (computational physics), astrophysics, climatology, chemistry, biology and manufacturing, as well as human systems in economics, psychology, social science and health care. Simulation of a system is represented as the running of the system's model. It can be used to explore and gain new insights into new technology and to estimate the performance of systems too complex for analytical solutions.[1] Computer simulations are realized by running computer programs that can be either small, running almost instantly on small devices, or large-scale programs that run for hours or days on network-based groups of computers. The scale of events being simulated by computer simulations has far exceeded anything possible (or perhaps even imaginable) using traditional paper-and-pencil mathematical modelling. In 1997, a desert-battle simulation of one force invading another involved the modelling of 66,239 tanks, trucks and other vehicles on simulated terrain around Kuwait, using multiple supercomputers in the DoD High Performance Computer Modernization Program.
[2] Other examples include a 1-billion-atom model of material deformation;[3] a 2.64-million-atom model of the complex protein-producing organelle of all living organisms, the ribosome, in 2005;[4] a complete simulation of the life cycle of Mycoplasma genitalium in 2012; and the Blue Brain project at EPFL (Switzerland), begun in May 2005, to create the first computer simulation of the entire human brain, right down to the molecular level.[5] Because of the computational cost of simulation, computer experiments are used to perform inference such as uncertainty quantification.[6]

Simulation versus model

A computer model is the algorithms and equations used to capture the behaviour of the system being modelled. By contrast, computer simulation is the actual running of the program that contains these equations or algorithms. Simulation, therefore, is the process of running a model. Thus one would not build a simulation; instead, one would build a model, and then either run the model or, equivalently, run a simulation.

History

Computer simulation developed hand-in-hand with the rapid growth of the computer, following its first large-scale deployment during the Manhattan Project in World War II to model the process of nuclear detonation. It was a simulation of 12 hard spheres using a Monte Carlo algorithm. Computer simulation is often used as an adjunct to, or substitute for, modelling systems for which simple closed-form analytic solutions are not possible. There are many types of computer simulation; their common feature is the attempt to generate a sample of representative scenarios for a model in which a complete enumeration of all possible states of the model would be prohibitive or impossible.[7]

Data preparation

The external data requirements of simulations and models vary widely.
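The 12-hard-sphere Monte Carlo simulation mentioned in the history above can be illustrated with a minimal sketch. This is not the historical code; the box size, move size and step count are illustrative assumptions, and the boundary handling is deliberately simplified:

```python
import random

DIAMETER = 1.0  # hard-sphere diameter (arbitrary units)

def overlaps(pos, others):
    """True if a sphere at `pos` overlaps any sphere in `others`."""
    return any(sum((a - b) ** 2 for a, b in zip(pos, o)) < DIAMETER ** 2
               for o in others)

def run(n=12, box=24.0, steps=5000, max_move=0.3, seed=1):
    """Metropolis-style hard-sphere Monte Carlo: propose a random
    displacement of one sphere and reject any move that would make
    two hard spheres overlap."""
    rng = random.Random(seed)
    # start on a line, two diameters apart: trivially overlap-free
    spheres = [[2.0 * i, box / 2, box / 2] for i in range(n)]
    accepted = 0
    for _ in range(steps):
        i = rng.randrange(n)
        trial = [(c + rng.uniform(-max_move, max_move)) % box
                 for c in spheres[i]]
        if not overlaps(trial, spheres[:i] + spheres[i + 1:]):
            spheres[i] = trial
            accepted += 1
    return spheres, accepted / steps
```

Because every accepted move preserves the no-overlap invariant, the final configuration is always a valid hard-sphere packing, which is the essential property of this class of algorithm.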
For some, the input might be only a few numbers (for example, simulation of a waveform of AC electricity on a wire), while others might require terabytes of information (such as weather and climate models). Input sources also vary widely: sensors and other physical devices connected to the model; control surfaces used to direct the progress of the simulation in some way; current or historical data entered by hand; values extracted as a by-product from other processes; and values output for the purpose by other simulations, models, or processes. Lastly, the time at which data is available varies: "invariant" data is often built into the model code, either because the value is truly invariant (e.g. the value of π) or because the designers consider the value to be invariant for all cases of interest; data can be entered into the simulation at startup, for example by reading one or more files, or by reading data from a preprocessor; and data can be provided during the simulation run, for example by a sensor network. Because of this variety, and because diverse simulation systems have many common elements, there are a large number of specialized simulation languages. The best-known may be Simula (sometimes called Simula-67, after the year 1967 when it was proposed). There are now many others. Systems that accept data from external sources must be very careful in knowing what they are receiving. While it is easy for computers to read in values from text or binary files, what is much harder is knowing what the accuracy (compared to measurement resolution and precision) of the values is. Often these are expressed as "error bars", a minimum and maximum deviation from the value range within which the true value is expected to lie. Because digital computer mathematics is not perfect, rounding and truncation errors multiply this error, so it is useful to perform an error analysis[8] to confirm that values output by the simulation will still be usefully accurate.
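The propagation of error bars through a calculation can be sketched in a simple, conservative way: sample the input interval densely and take the extremes of the output. The function and the ±5% sensor reading below are illustrative assumptions, not part of any particular simulation system:

```python
def propagate_interval(lo, hi, f, samples=1001):
    """Worst-case propagation of an input interval [lo, hi] through a
    function f by dense sampling: a crude, conservative alternative to
    analytic error analysis. Endpoints are included in the samples."""
    ys = [f(lo + (hi - lo) * k / (samples - 1)) for k in range(samples)]
    return min(ys), max(ys)

# Example: a sensor reads r = 100 ohms with +/-5% error bars;
# propagate through P = V^2 / r at V = 12 volts.
p_lo, p_hi = propagate_interval(95.0, 105.0, lambda r: 12.0 ** 2 / r)
# note the output interval is asymmetric around 144/100 = 1.44 W
```

Dense sampling only bounds the error for reasonably smooth functions; analytic error analysis or interval arithmetic gives firmer guarantees, at more effort.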
Types

Computer models can be classified according to several independent pairs of attributes, including: stochastic or deterministic (and as a special case of deterministic, chaotic) – see external links below for examples of stochastic vs. deterministic simulations; steady-state or dynamic; continuous or discrete (and as an important special case of discrete, discrete event or DE models); dynamic system simulation, e.g. electric systems, hydraulic systems or multi-body mechanical systems (described primarily by DAE:s), or dynamics simulation of field problems, e.g. CFD or FEM simulations (described by PDE:s); local or distributed. Another way of categorizing models is to look at the underlying data structures. For time-stepped simulations, there are two main classes: simulations which store their data in regular grids and require only next-neighbor access are called stencil codes; many CFD applications belong to this category. If the underlying graph is not a regular grid, the model may belong to the meshfree method class. Equations define the relationships between elements of the modelled system and attempt to find a state in which the system is in equilibrium. Such models are often used in simulating physical systems, as a simpler modelling case before dynamic simulation is attempted. Dynamic simulations model changes in a system in response to (usually changing) input signals. Stochastic models use random number generators to model chance or random events. A discrete event simulation (DES) manages events in time. Most computer, logic-test and fault-tree simulations are of this type. In this type of simulation, the simulator maintains a queue of events sorted by the simulated time they should occur. The simulator reads the queue and triggers new events as each event is processed. It is not important to execute the simulation in real time.
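The event-queue mechanism just described – a time-sorted queue that the simulator pops, with each processed event possibly scheduling new ones – can be sketched with Python's heapq module. The "arrival"/"departure" events and the fixed 5-unit service time are illustrative assumptions:

```python
import heapq
import itertools

def simulate(initial_events, horizon=100.0):
    """Minimal discrete event simulation: repeatedly pop the earliest
    event from a priority queue, process it, and let the handler
    schedule follow-up events at later simulated times."""
    counter = itertools.count()  # tie-breaker for events at equal times
    queue = [(t, next(counter), name) for t, name in initial_events]
    heapq.heapify(queue)
    log = []
    while queue:
        time, _, name = heapq.heappop(queue)
        if time > horizon:
            break
        log.append((time, name))
        if name == "arrival":
            # illustrative rule: each arrival departs 5 time units later
            heapq.heappush(queue, (time + 5.0, next(counter), "departure"))
    return log

trace = simulate([(0.0, "arrival"), (2.0, "arrival")])
# events are processed in simulated-time order, not insertion order
```

Note that simulated time jumps directly from one event to the next; nothing ties it to wall-clock time, which is exactly why real-time execution is unimportant here.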
It is often more important to be able to access the data produced by the simulation and to discover logic defects in the design or the sequence of events. A continuous dynamic simulation performs numerical solution of differential-algebraic equations or differential equations (either partial or ordinary). Periodically, the simulation program solves all the equations and uses the numbers to change the state and output of the simulation. Applications include flight simulators, construction and management simulation games, chemical process modelling, and simulations of electrical circuits. Originally, these kinds of simulations were actually implemented on analog computers, where the differential equations could be represented directly by various electrical components such as op-amps. By the late 1980s, however, most "analog" simulations were run on conventional digital computers that emulate the behaviour of an analog computer. A special type of discrete simulation that does not rely on a model with an underlying equation, but can nonetheless be represented formally, is agent-based simulation. In agent-based simulation, the individual entities (such as molecules, cells, trees or consumers) in the model are represented directly (rather than by their density or concentration) and possess an internal state and a set of behaviours or rules that determine how the agent's state is updated from one time-step to the next. Distributed models run on a network of interconnected computers, possibly through the Internet. Simulations dispersed across multiple host computers like this are often referred to as "distributed simulations". There are several standards for distributed simulation, including Aggregate Level Simulation Protocol (ALSP), Distributed Interactive Simulation (DIS), the High Level Architecture (HLA) and the Test and Training Enabling Architecture (TENA).
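The continuous dynamic simulation described above – periodically solving the equations and using the results to advance the state – can be sketched with forward-Euler integration of a single circuit equation, dv/dt = (v_in − v)/(RC), the kind of system once patched directly onto an analog computer with op-amps. The component values and step size are illustrative assumptions:

```python
def simulate_rc(v_in=5.0, r=1e3, c=1e-6, dt=1e-5, t_end=5e-3):
    """Forward-Euler integration of dv/dt = (v_in - v) / (R*C):
    a capacitor charging through a resistor toward v_in.
    Each step solves the equation once and updates the state."""
    v, t, trace = 0.0, 0.0, []
    while t < t_end:
        v += dt * (v_in - v) / (r * c)  # state update from the ODE
        t += dt
        trace.append((t, v))
    return trace

trace = simulate_rc()
# t_end is five time constants (5 * R * C), so the final voltage
# has closed most of the gap to v_in
```

Forward Euler is the simplest possible integrator; production simulators use higher-order or implicit methods for accuracy and stability, but the solve-then-update loop is the same.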
Visualization

Formerly, the output data from a computer simulation was sometimes presented in a table or a matrix showing how data were affected by numerous changes in the simulation parameters. The use of the matrix format was related to traditional use