Massive Molecular Dynamics Simulation for Studying 100 Million Atoms System
김상필, 이승협, 김경수, 이승철, 최정혜, 이규환, 이광렬
Korea Institute of Science and Technology (KIST), 2006. 10. 26-27, ICC



  • Massive Molecular Dynamics Simulation for Studying 100 Million Atoms System, 2006. 10. 26-27, ICC

  • Deposition in the Co-Al System: Co on Al, Al on Co

  • Simulation of Defects: materials science is all about defects.

  • Cover Image of Nature Materials (Oct. 2006), Nature Materials 5, 805-809 (2006): "Previous MD simulations have not seen the evolution of the strain from one- to three-dimensional compression that is observed in diffraction experiments. Our large-scale MD simulations of up to 352 million atoms resolve this important discrepancy through a detailed understanding of dislocation flow at high strain rates."

  • Nanoscience and Nanomaterials

  • Simulation at the (Sub-)Atomic Scale: first-principles calculation; molecular dynamics simulation. 10 nm: >1,000,000 atoms; 100 nm: >1,000,000,000 atoms

  • Advanced first-principles approaches; reliable parameter DB (TB parameters, force fields)

    hardware / software

    Multiscale simulation algorithm

    Man-Machine-Interface

  • TCAD: atomic-scale description of the devices; electron transport at the sub-atomic scale and through non-continuum media

  • (KIST) MD/MC

    Electron transport, CMOS FET, TCAD prototype

  • The present talk: introduction to LAMMPS (parallel MD code); modifications and improvements for materials science; public release information; 50 M atom system demo; performance report on the KIST grand supercomputer

  • LAMMPSLarge-scale Atomic/Molecular Massively Parallel SimulatorBy Dr. S. Plimpton (http://www.cs.sandia.gov/~sjplimp)

    Classical MD code with MPI parallelism (suitable for PC cluster systems); spatial decomposition of the simulation domain; runs on a single processor or in parallel; open-source distribution (C++); handles atomic, polymeric, biological, metallic, granular, or hybrid systems
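The spatial decomposition mentioned above can be illustrated with a toy sketch (plain Python, not LAMMPS code): the box is split into a grid of sub-domains and each atom is assigned to the sub-domain that contains its coordinates; in a real run each sub-domain maps to one MPI rank. The box size, grid shape, and atom count below are arbitrary illustration values.

```python
# Toy sketch of spatial decomposition: assign each atom to the
# sub-domain ("rank") whose region of the box contains it.
import random

def owner_rank(pos, box, grid):
    """Map a 3D position to an (i, j, k) sub-domain index."""
    return tuple(min(int(pos[d] / (box[d] / grid[d])), grid[d] - 1)
                 for d in range(3))

random.seed(0)
box = (40.0, 40.0, 40.0)      # simulation box lengths (arbitrary units)
grid = (2, 2, 2)              # 2x2x2 decomposition -> 8 "ranks"
atoms = [tuple(random.uniform(0, box[d]) for d in range(3))
         for _ in range(1000)]

counts = {}
for a in atoms:
    r = owner_rank(a, box, grid)
    counts[r] = counts.get(r, 0) + 1

print(len(counts))            # number of populated sub-domains
```

Each rank then computes forces only for its own atoms and exchanges boundary ("ghost") atoms with neighboring ranks, which is what makes the scheme scale to very large systems.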

    400 MD L-J potential for 100 steps

  • Potentials implemented in LAMMPS. Pairwise potentials: Lennard-Jones, Coulombic, Buckingham, Morse, Yukawa, frictional granular, tabulated, hybrid. Molecular potentials: bond, angle, dihedral, improper, class 2 (COMPASS). Polymer potentials: all-atom, united-atom, bead-spring. Long-range Coulombics: Ewald and PPPM (similar to particle-mesh Ewald). CHARMM and AMBER force-field compatibility.
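As a minimal illustration of the simplest entry in the list above, the Lennard-Jones 12-6 pair energy can be written in a few lines (illustrative Python, not LAMMPS source; reduced units with epsilon = sigma = 1):

```python
# Lennard-Jones 12-6 pair energy: U(r) = 4*eps*[(sigma/r)^12 - (sigma/r)^6]
def lj_energy(r, epsilon=1.0, sigma=1.0):
    sr6 = (sigma / r) ** 6          # (sigma/r)^6
    return 4.0 * epsilon * (sr6 * sr6 - sr6)

# The well minimum sits at r_min = 2^(1/6)*sigma with depth -epsilon,
# and U crosses zero at r = sigma.
r_min = 2.0 ** (1.0 / 6.0)
print(lj_energy(r_min), lj_energy(1.0))
```

All the pairwise potentials in the list follow this pattern (energy and force as a function of interatomic distance), which is why they parallelize so cleanly under spatial decomposition.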

  • Modification of LAMMPS: KIST-LAMMPS. Integration of Tersoff potentials for Si, Ge, C interactions; EAM potential database for various transition metals; implementation of a powerful crystal builder for thin-film process simulation

  • Public Release of KIST-LAMMPS


  • Crystal Builder: graphite, diamond, lonsdaleite, tetrahedral carbon (4-ring), zincblende-tridymite

  • Diamond Surfaces: (011), (111)

  • New Commands: new crystal builder

    orient x i j k / orient y i j k / orient z i j k
        specify the orientation tensor of the surface
    lattice vector a
        a : lattice parameter
    fill box xi xf yi yf zi zf
        specify the box size; length along the x axis = a*(xf - xi)
    fill primitive x1 x2 x3 y1 y2 y3 z1 z2 z3
        specify the primitive vector of each axis
    fill basis n {ni xi yi zi}
        specify the basis atoms and their positions
        n : number of basis atoms
        ni : index of atom i as specified in the potential file
        xi yi zi : position of atom i w.r.t. the origin
        {} : repeat for i = 1..n

    New force field type (to be released)
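The cell-plus-basis scheme that the fill commands describe can be sketched in plain Python (illustrative, not the KIST-LAMMPS implementation): atom positions are generated as cell origin plus basis offsets, with cells tiling the box along the lattice vectors. An FCC conventional cubic cell (4 basis atoms) is used as the example, with a lattice parameter near that of Au.

```python
# Generate atom positions for an FCC crystal: for each cell (i, j, k),
# place one atom per basis entry at (cell index + fractional offset) * a.
def build_fcc(a, nx, ny, nz):
    basis = [(0.0, 0.0, 0.0), (0.5, 0.5, 0.0),
             (0.5, 0.0, 0.5), (0.0, 0.5, 0.5)]   # fractional coordinates
    atoms = []
    for i in range(nx):
        for j in range(ny):
            for k in range(nz):
                for bx, by, bz in basis:
                    atoms.append(((i + bx) * a, (j + by) * a, (k + bz) * a))
    return atoms

atoms = build_fcc(a=4.08, nx=3, ny=3, nz=3)   # 4.08 A ~ Au lattice parameter
print(len(atoms))   # 4 atoms/cell * 27 cells = 108
```

Non-cubic structures (graphite, diamond, lonsdaleite, ...) differ only in the primitive vectors and the basis list, which is exactly what `fill primitive` and `fill basis` let the user specify.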

    manybody [Tersoff | Brenner]
        specify the new potential category (manybody) and the new potential (Tersoff or Brenner); covers Si, Ge, C, N, O

  • Benchmark of KIST-LAMMPS (Sep. 2006): EAM (Au), Tersoff (Si); 300 K NVT ensemble; Δt = 1 fs, 1,000 steps

    @ grand2.kist.re.kr: Intel Xeon 2.5 GHz, 2 GB RAM; 1024 CPUs, Myrinet; 3.07 TFlops

  • Elapsed Time: EAM, Tersoff

  • 50,000,000 Atom System Demo. Scenario: two 10 keV Ar atoms incident on an Au (001) substrate of 50 M atoms

    Calculation conditions: EAM + ZBL potential; Au substrate 204 × 163.2 × 25.5 nm³; position output every 10,000 steps; Δt = 1 fs; 128 CPUs @ grand2.kist.re.kr; memory used: 28 GB in total
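A quick consistency check on the quoted atom count: assuming FCC Au with the standard lattice parameter a ≈ 0.408 nm (a textbook value, not stated on the slide), the 204 × 163.2 × 25.5 nm³ substrate contains almost exactly 50 million atoms.

```python
# FCC Au: 4 atoms per conventional cubic cell of edge a.
a = 0.408                          # nm, assumed Au lattice parameter
lx, ly, lz = 204.0, 163.2, 25.5    # nm, substrate dimensions from the slide
cells = (lx / a) * (ly / a) * (lz / a)   # 500 * 400 * 62.5 cells
n_atoms = 4.0 * cells
print(int(round(n_atoms)))         # ~50,000,000
```

The box edges are in fact exact multiples of a (500a, 400a, 62.5a), so the substrate was evidently sized to hit the 50 M figure precisely.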

  • Elapsed Time

    Timing breakdown recovered from the embedded chart (units not labeled on the slide):

    Pair time      30151
    Neigh. time    130.1
    Comm. time     2401.5
    Output time    402.34
    Others         453.28

  • Visualization by AtomEye (http://164.107.79.177/Archive/Graphics/A3/A3.html): up to 4 M atoms, serial version; beyond 4 M atoms, parallel version. AtomEye is a memory- and I/O-intensive program. Example shown: 1 million atoms on 16 CPUs (parallel version).

    Atoms    Memory
    1 M      900 MB
    5 M      4.2 GB
    10 M     9.3 GB
    15 M     12.0 GB
    20 M     16.3 GB
    25 M     21.3 GB
    30 M     24.3 GB
    35 M     32.2 GB
    40 M     36.7 GB
    45 M     42 GB
    50 M     48 GB
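The atoms-vs-memory figures above are close to linear; a least-squares fit (added here for illustration, using the slide's data) gives roughly 0.95 GB per million atoms, a handy rule of thumb for sizing AtomEye jobs.

```python
# Least-squares slope of AtomEye memory use vs. atom count.
data = [(1, 0.9), (5, 4.2), (10, 9.3), (15, 12.0), (20, 16.3),
        (25, 21.3), (30, 24.3), (35, 32.2), (40, 36.7), (45, 42.0),
        (50, 48.0)]                 # (million atoms, GB), from the slide
n = len(data)
sx = sum(x for x, _ in data)
sy = sum(y for _, y in data)
sxx = sum(x * x for x, _ in data)
sxy = sum(x * y for x, y in data)
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)   # GB per M atoms
print(round(slope, 2))              # ~0.95
```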

    Nodes    Memory
    1        817 MB
    2        850 MB
    3        900 MB
    4        880 MB
    9        954 MB
    16       1024 MB

  • Public Release of KIST-LAMMPShttp://diamond.kist.re.kr/SMS

  • 4th Conference of ACCMS (Sep. 13-16, 2007), Seoul, Korea

    http://www-lab.imr.edu/~accms

  • Founded in Aug. 2000 @ Sendai: Prof. Y. Kawazoe (Tohoku Univ.), Prof. G. P. Das (IACS), Prof. B. L. Gu (Tsinghua Univ.). Aim: to enhance cooperation in the Asian region on computational materials science. Similar to the Psi-k network or the Pan American Consortium of Theorists (PACT).

    Conferences of ACCMS: 1st (Bangalore, India, Nov. 29 - Dec. 1, 2001); 2nd (Novosibirsk, Russia, Jul. 14-16, 2004); 3rd (Beijing, China, Sep. 8-11, 2005); WGM on Clusters and Nanomaterials (Sendai, Japan, Sep. 7-9, 2006); 4th (Seoul, Korea, Sep. 13-16, 2007)

  • Scope: from ab-initio calculations to MD/MC simulation; from electron transport and materials processing to property prediction; from molecular devices to alloys

    Participants: ~150; plenary lectures (45 min): 3; invited talks (25 min): 21; contributed oral (15 min): 25; contributed posters: 77