---
orphan: true
---

(2023_meeting)=

# 2023 Annual PETSc Meeting

```{image} https://petsc.gitlab.io/annual-meetings/2023/GroupPhoto.jpg
:alt: PETSc User Meeting 2023 group photo (Hermann Hall, 06/06/2023)
:width: 800
```

June 5-7, 2023, at the [Hermann Hall Conference Center](https://www.iit.edu/event-services/meeting-spaces/hermann-hall-conference-center)
in the Hermann Ballroom (when you enter the Hermann Hall building through the main entrance, walk straight back to the rear of the building and take a right)
(3241 South Federal Street, Chicago, IL)
on the campus of [The Illinois Institute of Technology (IIT)](https://www.iit.edu) in Chicago.
There is easy access from the hotels via the Chicago Elevated [Green](https://www.transitchicago.com/greenline) or [Red](https://www.transitchicago.com/redline) Lines.
[For parking, use Lot B5 (32nd & Federal St.)](https://www.iit.edu/cbsc/parking/visitor-and-event-parking).

Please test for Covid before attending the meeting and
wear a mask while traveling to the meeting.

In addition to a newbie user tutorial and a {any}`newbie_developer_workshop`, the meeting will include a "speed dating" session where users can ask questions of developers (and each other) about technical details of their particular simulations. Finally, the meeting will be interspersed with mini-tutorials that dive into particular aspects of PETSc that users may not be familiar with.

## Meeting times

- Monday, June 5: 1 pm to 5:30 pm
- Tuesday, June 6: 10:15 am to 5:30 pm
- Wednesday, June 7: 9 am to 3 pm

PETSc newbie user lightning tutorial:

- Monday, June 5: 10 am to 12 pm

PETSc {any}`newbie_developer_workshop`:

- Tuesday, June 6: 9 am to 10 am

## Registration

Please register at [EventBrite](https://www.eventbrite.com/e/petsc-2023-user-meeting-tickets-494165441137) to save your seat. There is a $100 registration fee to cover breaks and lunches; it may be waived if you cannot afford it.

## Submit a presentation

[Submit an abstract](https://docs.google.com/forms/d/e/1FAIpQLSesh47RGVb9YD9F1qu4obXSe1X6fn7vVmjewllePBDxBItfOw/viewform) by May 1st (but preferably now) to be included in the schedule. We welcome talks from all perspectives, including those who

- contribute to PETSc,
- use PETSc in their applications or libraries,
- develop the libraries and packages [called from PETSc](https://petsc.org/release/install/external_software/), and even
- are simply curious about using PETSc in their applications.

## Suggested hotels

- [Receive IIT hotel discounts.](https://www.iit.edu/procurement-services/purchasing/preferred-and-contract-vendors/hotels)

- More expensive

  - [Hilton Chicago](https://www.hilton.com/en/hotels/chichhh-hilton-chicago/?SEO_id=GMB-AMER-HI-CHICHHH&y_source=1_NzIxNzU2LTcxNS1sb2NhdGlvbi53ZWJzaXRl) 720 S Michigan Ave, Chicago
  - [Hotel Blake, an Ascend Hotel Collection Member](https://www.choicehotels.com/illinois/chicago/ascend-hotels/il480) 500 S Dearborn St, Chicago
  - [The Blackstone, Autograph Collection](https://www.marriott.com/en-us/hotels/chiab-the-blackstone-autograph-collection/overview/?scid=f2ae0541-1279-4f24-b197-a979c79310b0) 636 S Michigan Ave (lobby entrance on E Balbo Dr), Chicago

- Inexpensive

  - [Travelodge by Wyndham Downtown Chicago](https://www.wyndhamhotels.com/travelodge/chicago-illinois/travelodge-hotel-downtown-chicago/overview?CID=LC:TL::GGL:RIO:National:10073&iata=00093796) 65 E Harrison St, Chicago
  - [The Congress Plaza Hotel & Convention Center](https://www.congressplazahotel.com/?utm_source=local-directories&utm_medium=organic&utm_campaign=travelclick-localconnect) 520 S Michigan Ave, Chicago
  - [Hilton Garden Inn Chicago Downtown South Loop](https://www.hilton.com/en/hotels/chidlgi-hilton-garden-inn-chicago-downtown-south-loop/?SEO_id=GMB-AMER-GI-CHIDLGI&y_source=1_MTI2NDg5NzktNzE1LWxvY2F0aW9uLndlYnNpdGU%3D) 55 E 11th St, Chicago

## Agenda

### Monday, June 5

| Time     | Title                                                                                                                        | Speaker                 |
| -------- | ---------------------------------------------------------------------------------------------------------------------------- | ----------------------- |
| 10:00 am | Newbie tutorial ([Slides][s_00], [Video][v_00])                                                                              |                         |
| 11:30 am | Follow-up questions and meetings                                                                                             |                         |
| 12:00 pm | **Lunch** for tutorial attendees and early arrivals                                                                          |                         |
| 1:00 pm  | Some thoughts on the future of PETSc ([Slides][s_01], [Video][v_01])                                                         | [Barry Smith]           |
| 1:30 pm  | A new nonhydrostatic capability for MPAS-Ocean ([Slides][s_02], [Video][v_02])                                               | [Sara Calandrini]       |
| 2:00 pm  | MultiFlow: A coupled balanced-force framework to solve multiphase flows in arbitrary domains ([Slides][s_03], [Video][v_03]) | [Berend van Wachem]     |
| 2:30 pm  | Mini tutorial: PETSc and PyTorch interoperability ([Slides][s_04], [Video][v_04], [IPython code][c_04])                      | [Hong Zhang (Mr.)]      |
| 2:45 pm  | **Coffee Break**                                                                                                             |                         |
| 3:00 pm  | Towards enabling digital twins capabilities for a cloud chamber (slides and video unavailable)                               | [Vanessa Lopez-Marrero] |
| 3:30 pm  | PETSc ROCKS ([Slides][s_06], [Video][v_06])                                                                                  | [David May]             |
| 4:00 pm  | Software Development and Deployment Including PETSc ([Slides][s_07], [Video][v_07])                                          | [Tim Steinhoff]         |
| 4:30 pm  | Multiscale, Multiphysics Simulation Through Application Composition Using MOOSE ([Slides][s_08], [Video][v_08])              | [Derek Gaston]          |
| 5:00 pm  | PETSc Newton Trust-Region for Simulating Large-scale Engineered Subsurface Systems with PFLOTRAN ([Slides][s_09])            | [Heeho Park]            |
| 5:30 pm  | End of first day                                                                                                             |                         |

### Tuesday, June 6

| Time     | Title                                                                                                                                                   | Speaker                  |
| -------- | ------------------------------------------------------------------------------------------------------------------------------------------------------- | ------------------------ |
| 9:00 am  | Newbie Developer Workshop (optional)                                                                                                                    |                          |
| 10:00 am | **Coffee Break**                                                                                                                                        |                          |
| 10:15 am | Experiences in solving nonlinear eigenvalue problems with SLEPc ([Slides][s_10], [Video][v_10])                                                         | [Jose E. Roman]          |
| 10:45 am | MPI Multiply Threads ([Slides][s_11], [Video][v_11])                                                                                                    | [Hui Zhou]               |
| 11:15 am | Mini tutorial: PETSc on the GPU ([Slides][s_12], [Video][v_12])                                                                                         | [Junchao Zhang]          |
| 11:30 am | AMD GPU benchmarking, documentation, and roadmap ([Slides][s_13], video unavailable)                                                                    | [Justin Chang]           |
| 12:00 pm | **Lunch**                                                                                                                                               |                          |
| 1:00 pm  | Mini tutorial: petsc4py ([Slides][s_14], [Video][v_14])                                                                                                 | [Stefano Zampini]        |
| 1:15 pm  | Transparent Asynchronous Compute Made Easy With PETSc ([Slides][s_15], [Video][v_15])                                                                   | [Jacob Faibussowitsch]   |
| 1:45 pm  | Using Kokkos Ecosystem with PETSc on modern architectures ([Slides][s_16])                                                                              | [Luc Berger-Vergiat]     |
| 2:15 pm  | Intel oneAPI Math Kernel Library, what’s new and what’s next? ([Slides][s_17], [Video][v_17])                                                           | [Spencer Patty]          |
| 2:45 pm  | Mini tutorial: DMPlex ([Video][v_18], slides unavailable)                                                                                               | [Matt Knepley]           |
| 3:00 pm  | **Coffee Break**                                                                                                                                        |                          |
| 3:15 pm  | Scalable cloud-native thermo-mechanical solvers using PETSc (slides and video unavailable)                                                              | [Ashish Patel]           |
| 3:45 pm  | A mimetic finite difference based quasi-static magnetohydrodynamic solver for force-free plasmas in tokamak disruptions ([Slides][s_20], [Video][v_20]) | [Zakariae Jorti]         |
| 4:15 pm  | High-order FEM implementation in AMReX using PETSc ([Slides][s_21], [Video][v_21])                                                                      | [Alex Grant]             |
| 4:45 pm  | An Immersed Boundary method for Elastic Bodies Using PETSc ([Slides][s_22], [Video][v_22])                                                              | [Mohamad Ibrahim Cheikh] |
| 5:15 pm  | Mini tutorial: DMNetwork ([Slides][s_23], [Video][v_23])                                                                                                | [Hong Zhang (Ms.)]       |
| 5:30 pm  | End of second day                                                                                                                                       |                          |

### Wednesday, June 7

| Time     | Title                                                                                                                               | Speaker                             |
| -------- | ----------------------------------------------------------------------------------------------------------------------------------- | ----------------------------------- |
| 9:00 am  | XGCm: An Unstructured Mesh Gyrokinetic Particle-in-cell Code for Exascale Fusion Plasma Simulations ([Slides][s_24], [Video][v_24]) | [Chonglin Zhang]                    |
| 9:30 am  | PETSc-PIC: A Structure-Preserving Particle-In-Cell Method for Electrostatic Solves ([Slides][s_25], [Video][v_25])                  | [Daniel Finn]                       |
| 9:57 am  | Landau Collisions in the Particle Basis with PETSc-PIC ([Slides][s_26], [Video][v_26])                                              | [Joseph Pusztay]                    |
| 10:15 am | **Coffee Break**                                                                                                                    |                                     |
| 10:30 am | Mini tutorial: DMSwarm ([Slides][s_27], [Video][v_27])                                                                              | [Joseph Pusztay\*][joseph pusztay*] |
| 10:45 am | Scalable Riemann Solvers with the Discontinuous Galerkin Method for Hyperbolic Network Simulation ([Slides][s_28], [Video][v_28])   | [Aidan Hamilton]                    |
| 11:15 am | Numerical upscaling of network models using PETSc ([Slides][s_29], [Video][v_29])                                                   | [Maria Vasilyeva]                   |
| 11:45 am | Mini tutorial: TaoADMM ([Slides][s_30], [Video][v_30])                                                                              | [Hansol Suh]                        |
| 12:00 pm | **Lunch**                                                                                                                           |                                     |
| 1:00 pm  | PETSc in the Ionosphere ([Slides][s_31], [Video][v_31])                                                                             | [Matt Young]                        |
| 1:30 pm  | From the trenches: porting mef90 ([Slides][s_32], [Video][v_32])                                                                    | [Blaise Bourdin]                    |
| 2:00 pm  | PERMON library for quadratic programming ([Slides][s_33], [Video][v_33])                                                            | [Jakub Kruzik]                      |
| 2:22 pm  | Distributed Machine Learning for Natural Hazard Applications Using PERMON ([Slides][s_34], [Video][v_34])                           | [Marek Pecha]                       |
| 2:45 pm  | Wrap up                                                                                                                             |                                     |
| 3:00 pm  | End of meeting                                                                                                                      |                                     |

(newbie_developer_workshop)=

## Newbie Developer Workshop

Tuesday, June 6, at 9 am. Some of the topics to be covered:

- {any}`Exploring the developer documentation<ind_developers>`

- {any}`petsc_developers_communication_channels`

- {any}`PETSc Git branch organization<sec_integration_branches>`

- {any}`ch_contributing`

  - {any}`Starting a merge request (MR)<ch_developingmr>`
  - {any}`Submitting and monitoring an MR<ch_submittingmr>`
  - {any}`GitLab CI pipelines<pipelines>`
  - {any}`PETSc style guide<style>`

- Reviewing someone else's MR

- Adding new Fortran and Python function bindings

- PETSc's

  - {any}`configure system<ch_buildsystem>`
  - compiler system, and
  - {any}`testing system including the GitLab CI<test_harness>`

- Any other topics requested by potential contributors

## Abstracts

(luc-berger-vergiat)=

:::{topic} **Using Kokkos Ecosystem with PETSc on modern architectures**
**Luc Berger-Vergiat**

Sandia National Laboratories

Supercomputers increasingly rely on GPUs to achieve high
throughput while maintaining a reasonable power consumption. Consequently,
scientific applications are adapting to this new environment, and new
algorithms are designed to leverage the high concurrency of GPUs. In this
presentation, I will show how the Kokkos Ecosystem can help alleviate some
of the difficulties associated with support for multiple CPU/GPU
architectures. I will also show some results using the Kokkos and Kokkos
Kernels libraries with PETSc on modern architectures.
:::

(blaise-bourdin)=

:::{topic} **From the trenches: porting mef90**
**Blaise Bourdin**

McMaster University

mef90 is a distributed three-dimensional unstructured finite-element
implementation of various phase-field models of fracture. In this talk,
I will share the experience gained while porting mef90 from PETSc 3.3 to 3.18.
:::

(sara-calandrini)=

:::{topic} **A new non-hydrostatic capability for MPAS-Ocean**
**Sara Calandrini**, Darren Engwirda, Luke Van Roekel

Los Alamos National Laboratory

The Model for Prediction Across Scales-Ocean (MPAS-Ocean) is an
open-source, global ocean model and one component of the Department
of Energy’s E3SM framework, which includes atmosphere, sea ice, and
land-ice models. In this work, a new formulation for the ocean model is
presented that solves the non-hydrostatic, incompressible Boussinesq
equations on unstructured meshes. The introduction of this non-hydrostatic
capability is necessary for the representation of fine-scale dynamical
processes, including resolution of internal wave dynamics and large eddy
simulations. Compared to the standard hydrostatic formulation,
a non-hydrostatic pressure solver and a vertical momentum equation are
added, where the PETSc (Portable Extensible Toolkit for Scientific
Computation) library is used for the inversion of a large sparse system for
the non-hydrostatic pressure. Numerical results comparing the solutions of
the hydrostatic and non-hydrostatic models are presented, and the parallel
efficiency and accuracy of the time-stepper are evaluated.
:::

(justin-chang)=

:::{topic} **AMD GPU benchmarking, documentation, and roadmap**
**Justin Chang**

AMD Inc.

This talk comprises three parts. First, we present an overview of
relatively new training documentation, such as the "AMD lab notes,"
intended to help current and potential users of AMD GPUs get the best
experience out of their applications or algorithms. Second, we briefly
discuss implementation details regarding the PETSc HIP backend introduced
into the PETSc library late last year and present some performance
benchmarking data on AMD hardware. Lastly, we give a preview of the
upcoming MI300 series APU and how software developers can prepare to
leverage this new type of accelerator.
:::

(mohamad-ibrahim-cheikh)=

:::{topic} **An Immersed Boundary method for Elastic Bodies Using PETSc**
**Mohamad Ibrahim Cheikh**, Konstantin Doubrovinski

Doubrovinski Lab, The University of Texas Southwestern Medical Center

This study presents a parallel implementation of an immersed boundary
method code using the PETSc distributed memory module. This work aims
to simulate a complex developmental process that occurs in the
early stages of embryonic development, which involves the transformation of
the embryo into a multilayered and multidimensional structure. To
accomplish this, we used the PETSc parallel module to solve
a linear system for the Eulerian fluid dynamics while simultaneously
coupling it with a deforming Lagrangian elastic body to model the
deformable embryonic tissue. This approach allows for a detailed simulation
of the interaction between the fluid and the tissue, which is critical for
accurately modeling the developmental process. Overall, this work
highlights the potential of the immersed boundary method and parallel
computing techniques for simulating complex physical phenomena.
:::

(jacob-faibussowitsch)=

:::{topic} **Transparent Asynchronous Compute Made Easy With PETSc**
**Jacob Faibussowitsch**

Argonne National Laboratory

Asynchronous GPU computing has historically been difficult to integrate
scalably at the library level. We provide an update on recent work
implementing a fully asynchronous framework in PETSc. We give detailed
performance comparisons and provide a demo to showcase the proposed model's
effectiveness and ease of use.
:::

(daniel-finn)=

:::{topic} **PETSc-PIC: A Structure-Preserving Particle-In-Cell Method for Electrostatic Solves**
**Daniel Finn**

University at Buffalo

Numerical solutions to the Vlasov-Poisson equations have important
applications in the fields of plasma physics, solar physics, and cosmology.
The goal of this research is to develop a structure-preserving,
electrostatic and gravitational Vlasov-Poisson(-Landau) model using the
Portable, Extensible Toolkit for Scientific Computation (PETSc) and study
the presence of Landau damping in a variety of systems, such as
thermonuclear fusion reactors and galactic dynamics. The PETSc
Particle-In-Cell (PETSc-PIC) model is a highly scalable,
structure-preserving PIC method with multigrid capabilities. In the PIC
method, a hybrid discretization is constructed with a grid of finitely
supported basis functions to represent the electric, magnetic, and/or
gravitational fields, and a distribution of delta functions to represent
the particle field. Collisions are added to the formulation using
a particle-basis Landau collision operator recently added to the PETSc
library.
:::

(derek-gaston)=

:::{topic} **Multiscale, Multiphysics Simulation Through Application Composition Using MOOSE**
**Derek Gaston**

Idaho National Laboratory

Eight years ago, at the PETSc 20 meeting, I introduced the idea of
"Simplifying Multiphysics Through Application Composition" -- the idea
that physics applications can be built in such a way that they can
instantly be combined to tackle complicated multiphysics problems.
This talk will serve as an update on those plans. I will detail the
evolution of that idea, how we’re using it in practice, how well it’s
working, and where we’re going next. Motivating examples will be drawn
from nuclear engineering, and practical aspects, such as testing, will
be explored.
:::

(alex-grant)=

:::{topic} **High-order FEM implementation in AMReX using PETSc**
**Alex Grant**, Karthik Chockalingam, Xiaohu Guo

Science and Technology Facilities Council (STFC), UK

AMReX is a C++ block-structured framework for adaptive mesh refinement,
typically used for finite difference or finite volume codes. We describe
a first attempt at a finite element implementation in AMReX using PETSc.
AMReX splits the domain of uniform elements into rectangular boxes at each
refinement level, with higher levels overlapping rather than replacing
lower levels and with each level solved independently. AMReX boxes can be
cell-centered or nodal; we use cell-centered boxes to represent the geometry
and mesh and nodal boxes to identify nodes to constrain and store results
for visualization. We convert AMReX’s independent spatial indices into
a single global index, then use MATMPIAIJ to assemble the system matrix per
refinement level. In an unstructured grid, isoparametric mapping is
required for each element; using a structured grid avoids both this
and indirect addressing, which provides significant potential performance
advantages. We have solved time-dependent parabolic equations and seen
performance gains compared to unstructured finite elements. Further
developments will include arbitrary higher-order schemes and
multi-level hp refinement with arbitrary hanging nodes. PETSc uses AMReX
domain decomposition to partition the matrix and right-hand side vectors.
For each higher level, not all of the domain will be refined, but AMReX’s
indices cover the whole space; this poses an indexing challenge and can
lead to over-allocation of memory. It is still to be explored whether DM
data structures would provide a benefit over MATMPIAIJ.
:::

35209cc9507SBarry Smith(aidan-hamilton)=
35309cc9507SBarry Smith
35409cc9507SBarry Smith:::{topic} **Scalable Riemann Solvers with the Discontinuous Galerkin Method for Hyperbolic Network Simulation**
35509cc9507SBarry Smith**Aidan Hamilton**
35609cc9507SBarry Smith
35709cc9507SBarry Smith, Jing-Mei Qiu, Hong Zhang
35809cc9507SBarry Smith
35909cc9507SBarry SmithUniversity of Delaware
36009cc9507SBarry Smith
36109cc9507SBarry SmithWe develop highly efficient and effective computational algorithms
36209cc9507SBarry Smithand simulation tools for fluid simulations on a network. The mathematical
36309cc9507SBarry Smithmodels are a set of hyperbolic conservation laws on the edges of a network, as
36409cc9507SBarry Smithwell as coupling conditions on junctions of a network. For example, the
36509cc9507SBarry Smithshallow water system, together with flux balance and continuity conditions
36609cc9507SBarry Smithat river intersections, model water flows on a river network. The
36709cc9507SBarry Smithcomputationally accurate and robust discontinuous Galerkin methods,
36809cc9507SBarry Smithcoupled with explicit strong-stability preserving Runge-Kutta methods, are
36909cc9507SBarry Smithimplemented for simulations on network edges. Meanwhile, linear and
37009cc9507SBarry Smithnonlinear scalable Riemann solvers are being developed and implemented at
37109cc9507SBarry Smithnetwork vertices. These network simulations result in tools built using
37209cc9507SBarry SmithPETSc and DMNetwork software libraries for the scientific community in
37309cc9507SBarry Smithgeneral. Simulation results of a shallow water system on a Mississippi
37409cc9507SBarry Smithriver network with over one billion network variables are performed on an
37509cc9507SBarry Smithextreme- scale computer using up to 8,192 processors with an optimal
37609cc9507SBarry Smithparallel efficiency. Further potential applications include traffic flow
37709cc9507SBarry Smithsimulations on a highway network and blood flow simulations on an arterial
37809cc9507SBarry Smithnetwork, among many others
37909cc9507SBarry Smith:::

(zakariae-jorti)=

:::{topic} **A mimetic finite difference based quasi-static magnetohydrodynamic solver for force-free plasmas in tokamak disruptions**
**Zakariae Jorti**

, Qi Tang, Konstantin Lipnikov, Xianzhu Tang

Los Alamos National Laboratory

Force-free plasmas are a good approximation in the low-beta case, where the
plasma pressure is tiny compared with the magnetic pressure. On time scales
long compared with the transit time of Alfvén waves, the evolution of
a force-free plasma is most efficiently described by a quasi-static
magnetohydrodynamic (MHD) model, which ignores the plasma inertia. In this
work, we consider a regularized quasi-static MHD model for force-free
plasmas in tokamak disruptions and propose a mimetic finite difference
(MFD) algorithm, which is targeted at applications such as the cold
vertical displacement event (VDE) of a major disruption in an ITER-like
tokamak reactor. In the case of whole device modeling, we further consider
the two sub-domains of the plasma region and wall region and their coupling
through an interface condition. We develop a parallel, fully implicit, and
scalable MFD solver based on PETSc and its DMStag data structure to
discretize the five-field quasi-static perpendicular plasma dynamics
model on a 3D structured mesh. The MFD spatial discretization is coupled
with a fully implicit DIRK scheme. The full algorithm exactly preserves the
divergence-free condition of the magnetic field under a generalized Ohm's
law. The preconditioner employed is a four-level fieldsplit preconditioner,
built by combining separate preconditioners for the individual
fields, which applies multigrid or exact factorization to the
separate sub-blocks. The numerical results confirm that the
divergence-free constraint is strongly satisfied and demonstrate the
performance of the fieldsplit preconditioner and overall algorithm.
Simulations of ITER VDE cases over the actual plasma current diffusion time
are also presented.
:::
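A fieldsplit preconditioner of the kind described above is normally assembled at run time from PETSc's options database. A hedged sketch (the split composition and inner solvers are illustrative, not the authors' actual four-level configuration):

```none
-pc_type fieldsplit
-pc_fieldsplit_type multiplicative   # combine the per-field preconditioners
-fieldsplit_0_pc_type gamg           # algebraic multigrid on one field block
-fieldsplit_1_pc_type lu             # exact factorization on another
```

Nested ("multi-level") splits follow the same pattern, with `-fieldsplit_<name>_pc_type fieldsplit` turning a sub-block into a further split.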

(jakub-kruzik)=

:::{topic} **PERMON library for quadratic programming**
**Jakub Kruzik**

, Marek Pecha, David Horak

VSB - Technical University of Ostrava, Czechia

PERMON (Parallel, Efficient, Robust, Modular, Object-oriented, Numerical)
is a library based on PETSc for solving quadratic programming (QP)
problems. We will illustrate PERMON usage with our implementation of the FETI
(finite element tearing and interconnecting) method. This FETI
implementation involves a chain of QP transformations, such as
dualization, which simplifies the given QP. We will also discuss some useful
options, like viewing the Karush-Kuhn-Tucker (optimality) conditions for each
QP in the chain. Finally, we will showcase some QP applications solved by
PERMON, such as the solution of contact problems for hydro-mechanical
problems with discrete fracture networks and the solution of support vector
machines using the PermonSVM module.
:::

(vanessa-lopez-marrero)=

:::{topic} **Towards enabling digital twins capabilities for a cloud chamber**
**Vanessa Lopez-Marrero**

, Kwangmin Yu, Tao Zhang, Mohammad Atif, Abdullah Al Muti Sharfuddin, Fan Yang, Yangang Liu, Meifeng Lin, Foluso Ladeinde, Lingda Li

Brookhaven National Laboratory

Particle-resolved direct numerical simulations (PR-DNS), which not
only resolve the smallest turbulent eddies but also track the development and
motion of individual particles, are an essential tool for studying
aerosol-cloud-turbulence interactions. For instance, PR-DNS may complement
experimental facilities designed to study key physical processes in
a controlled environment and therefore serve as digital twins for such
cloud chambers. In this talk, we will present our ongoing work aimed at
enabling the use of PR-DNS for this purpose. We will describe the physical
model used, which consists of a set of fluid dynamics equations for
air velocity, temperature, and humidity coupled with a set of equations
for particle (i.e., droplet) growth/tracing. The numerical method used to
solve the model, which employs PETSc solvers in its implementation, will be
discussed, as well as our current efforts to assess the performance and
scalability of the numerical solver.
:::

(david-may)=

:::{topic} **PETSc ROCKS**
**David May**

University of California, San Diego

The field of geodynamics is concerned with understanding
the deformation history of the solid Earth over time scales of millions to
billions of years. The infeasibility of extracting a spatially and
temporally complete geological record from rocks that are currently
exposed at the surface of the Earth compels many geodynamicists to employ
computational simulations of geological processes.

In this presentation I will discuss several geodynamic software packages
which utilize PETSc. I intend to highlight how PETSc has played an
important role in enabling and advancing the state of the art in geodynamic
software. I will also summarize my own experiences and observations of how
geodynamics-specific functionality has driven the
development of new general-purpose PETSc functionality.
:::

(heeho-park)=

:::{topic} **PETSc Newton Trust-Region for Simulating Large-scale Engineered Subsurface Systems with PFLOTRAN**
**Heeho Park**

, Glenn Hammond, Albert Valocchi

Sandia National Laboratories

Modeling large-scale engineered subsurface systems entails significant
additional numerical challenges. For a nuclear waste repository, the
challenges arise from: (a) the need to accurately represent both the waste
form processes and the shafts, tunnels, and barriers at the small spatial scale
and the large-scale transport processes throughout geological formations;
(b) the strong contrast in material properties such as porosity and
permeability, and the nonlinear constitutive relations for multiphase flow;
(c) the decay of high-level nuclear waste, which causes nearby water to boil off
into steam, leading to dry-out. These can lead to an ill-conditioned
Jacobian matrix and non-convergence with Newton's method due to
discontinuous nonlinearity in constitutive models.

We apply the open-source simulator PFLOTRAN, which employs a finite-volume
discretization and uses the PETSc parallel framework. We implement within
PETSc the general-purpose nonlinear solvers Newton trust-region dogleg
Cauchy (NTRDC) and Newton trust-region (NTR) to demonstrate the
effectiveness of these advanced solvers. The results demonstrate speed-ups
compared to the default solvers of PETSc and the completion of simulations
that could never be completed with them.

SNL is managed and operated by NTESS under DOE NNSA contract DE-NA0003525.
:::
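PETSc's built-in trust-region Newton variant is likewise selected from the options database. A minimal sketch (the NTRDC solver described in the talk is the authors' addition, so its option values are not shown here):

```none
-snes_type newtontr      # Newton's method with a trust region instead of a line search
-snes_monitor            # print the nonlinear residual norm at each iteration
-snes_converged_reason   # report why the nonlinear solve stopped
```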

(ashish-patel)=

:::{topic} **Scalable cloud-native thermo-mechanical solvers using PETSc**
**Ashish Patel**

, Jeremy Theler, Francesc Levrero-Florencio, Nabil Abboud, Mohammad Sarraf Joshaghani, Scott McClennan

Ansys, Inc.

This talk presents how the Ansys OnScale team uses PETSc to
develop finite element-based thermo-mechanical solvers for scalable
nonlinear simulations on the cloud. We will first provide an overview of
features available in the solver and then discuss how some of the PETSc
objects, like DMPlex and TS, have helped us speed up our development
process. We will also talk about the workarounds we have incorporated to
address the current limitations of some of the functions from DMPlex for
our use cases involving multi-point constraints and curved elements.
Finally, we demonstrate how PETSc's linear solvers scale on multi-node
cloud instances.
:::

(spencer-patty)=

:::{topic} **Intel oneAPI Math Kernel Library, what's new and what's next?**
**Spencer Patty**

Intel Corporation

This talk provides an overview of the Intel® oneAPI Math Kernel Library
(oneMKL), which offers optimized math routines for both Intel
CPUs and GPUs. PETSc already utilizes several BLAS/LAPACK/sparse
BLAS routines from oneMKL on Intel CPUs; as part of the Aurora project
with Argonne, we discuss the use of OpenMP offload APIs for Intel GPUs.
We explore software and hardware improvements for better sparse linear
algebra performance and have an informal discussion of how to further
support the PETSc community.
:::

(marek-pecha)=

:::{topic} **Distributed Machine Learning for Natural Hazard Applications Using PERMON**
**Marek Pecha**

, David Horak, Richard Tran Mills, Zachary Langford

VSB – Technical University of Ostrava, Czechia

We will present a software solution for distributed machine learning
supporting computation on multiple GPUs, running on top of the PETSc
framework, which we will demonstrate in applications related to natural
hazard localization and detection employing supervised uncertainty
modeling. It is called PERMON and is designed for convex optimization
using quadratic programming, and its extension PermonSVM implements
maximal-margin classifier approaches associated with support vector
machines (SVMs). Although deep learning (DL) has become popular in recent
years, SVMs are still applicable. However, unlike DL, the SVM approach requires
additional feature engineering or feature selection. We will present our
workflow and show how to achieve reasonable models for an application
related to wildfire localization in Alaska.
:::

(joseph-pusztay)=

:::{topic} **Landau Collisions in the Particle Basis with PETSc-PIC**
**Joseph Pusztay**

, Matt Knepley, Mark Adams

University at Buffalo

The kinetic description of a plasma encompasses the fine-scale interactions of
the various bodies of which it is composed, and applies to a litany of
experiments ranging from magnetically confined fusion plasmas in the laboratory
to the scale of the solar corona. Of great import to these
descriptions are collisions in the grazing limit, which transfer momentum
between components of the plasma. Until recently, these have best been
described conservatively by finite element discretizations of the Landau
collision integral. In recent years a particle discretization has been
proven to preserve the appropriate eigenfunctions of the system, as well as
physically relevant quantities. I present here recent work on a purely
particle-discretized Landau collision operator which preserves mass,
momentum, and energy, with associated accuracy benchmarks in PETSc.
:::

(jose-e-roman)=

:::{topic} **Experiences in solving nonlinear eigenvalue problems with SLEPc**
**Jose E. Roman**

Universitat Politècnica de València

One of the unique features of SLEPc is the module for the general nonlinear
eigenvalue problem (NEP), where we want to compute a few eigenvalues and
corresponding eigenvectors of a large-scale parameter-dependent matrix
T(lambda). In this talk, we will illustrate the use of NEP in the context
of two applications, one of them coming from the characterization of
resonances in nanophotonic devices, and the other one from a problem in
aeroacoustics.
:::
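As background for the talk above: SLEPc exposes its NEP solvers through runtime options. A hedged sketch (the solver type and settings actually used for the two applications are not shown here):

```none
-nep_type slp   # successive linear problems method
-nep_nev 5      # number of eigenvalues requested
-nep_monitor    # print error estimates at each iteration
```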

(barry-smith)=

:::{topic} **Some thoughts on the future of PETSc**
**Barry Smith**

Flatiron Institute

How will PETSc evolve and grow in the future? How can PETSc algorithms and
simulations be integrated into the emerging world of machine learning and
deep neural networks? I will provide an informal discussion of these topics
and my thoughts.
:::

(tim-steinhoff)=

:::{topic} **Software Development and Deployment Including PETSc**
**Tim Steinhoff**

, Volker Jacht

Gesellschaft für Anlagen- und Reaktorsicherheit (GRS), Germany

Once it is decided that PETSc shall handle certain numerical subtasks in
your software, the question arises of how to smoothly incorporate PETSc
into the overall software development and deployment processes. In this
talk, we present our approach to handling such a situation for the code
family AC2, which is developed and distributed by GRS. AC2 is used to
simulate the behavior of nuclear reactors during operation, transients, and
design-basis and beyond-design-basis accidents, up to radioactive releases
to the environment. The talk addresses our experiences, the challenges that
had to be overcome, and how we make use of GitLab, CMake, and Docker
to cleanly incorporate PETSc into our software development
cycle.
:::
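One common way to incorporate PETSc into a CMake-based build such as the one described above is through the pkg-config file that PETSc installs. A minimal sketch, assuming a hypothetical target `my_solver` and `PKG_CONFIG_PATH` pointing at `$PETSC_DIR/$PETSC_ARCH/lib/pkgconfig`:

```cmake
find_package(PkgConfig REQUIRED)
# PETSc installs a PETSc.pc file describing its compile and link flags
pkg_check_modules(PETSC REQUIRED IMPORTED_TARGET PETSc)
target_link_libraries(my_solver PRIVATE PkgConfig::PETSC)
```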

(hansol-suh)=

:::{topic} **TaoADMM**
**Hansol Suh**

Argonne National Laboratory

In this tutorial, we will give an introduction to the ADMM algorithm in
TAO. It will include a walkthrough of the ADMM algorithm with some real-life
examples, and tips on setting up the framework to solve problems with ADMM
in PETSc/TAO.
:::
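The ADMM iteration itself is easy to illustrate outside of TAO. A minimal scalar sketch (an illustration only, not the TAO implementation), solving minimize (x - 3)^2 + |z| subject to x = z, whose analytic optimum is x = z = 2.5:

```python
def soft_threshold(v, k):
    """Proximal operator of k*|.| -- the closed-form z-update for an l1 term."""
    if v > k:
        return v - k
    if v < -k:
        return v + k
    return 0.0

def admm(rho=1.0, iters=50):
    x = z = u = 0.0  # primal variables and scaled dual variable
    for _ in range(iters):
        # x-update: argmin_x (x-3)^2 + (rho/2)(x - z + u)^2, solved in closed form
        x = (6.0 + rho * (z - u)) / (2.0 + rho)
        # z-update: proximal step on |z| at the shifted point x + u
        z = soft_threshold(x + u, 1.0 / rho)
        # dual update: gradient ascent on the constraint x = z
        u += x - z
    return x, z

x, z = admm()
print(round(x, 6))  # 2.5
```

The same three-step structure (two proximal/minimization steps plus a dual update) is what a TAO ADMM setup organizes, with user-provided callbacks in place of the closed-form updates here.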

(maria-vasilyeva)=

:::{topic} **Numerical upscaling of network models using PETSc**
**Maria Vasilyeva**

Texas A&M University-Corpus Christi

Multiphysics models on large networks are used in many applications, for
example, pore-network models in reservoir simulation, epidemiological
models of disease spread, ecological models of multispecies interaction,
and medical applications such as multiscale multidimensional simulations of
blood flow. This work presents the construction of a numerical
upscaling and multiscale method for network models. An accurate
coarse-scale approximation is generated by solving local problems in
sub-networks. The numerical implementation of the network model is based
on the PETSc DMNetwork framework. Results are presented for square
and random heterogeneous networks generated by OpenPNM.
:::

(berend-van-wachem)=

:::{topic} **MultiFlow: A coupled balanced-force framework to solve multiphase flows in arbitrary domains**
**Berend van Wachem**

, Fabien Evrard

University of Magdeburg, Germany

Since 2000, we have been working on a finite-volume numerical framework,
"MultiFlow", to predict multiphase flows in arbitrary domains by solving
various flavors of the incompressible and compressible Navier-Stokes
equations using PETSc. This framework enables the simulation of creeping,
laminar, and turbulent flows with droplets and/or particles at various
scales. It relies on a collocated arrangement of the unknown
variables and momentum-weighted interpolation to determine the fluxes at
the cell faces to couple velocity and pressure. To maximize robustness, the
governing flow equations are solved in a coupled fashion, i.e., as part of
a single equation system involving all flow variables. Various modules are
available within the code in addition to its core flow solver, allowing it to
model interfacial and particulate flows in various flow regimes and at various
scales. The framework relies heavily on the PETSc library not only to solve the
system of governing equations but also for the handling of unknown
variables, the parallelization of the computational domain, and the exchange of
data over processor boundaries. We are now in the 3rd generation of our
code, currently using a combination of DMDA and DMPlex with the DMForest/p4est
frameworks to allow for adaptive octree refinement of the
computational mesh. In this contribution, we will present the details of
the discretization and the parallel implementation of our framework and
describe its interconnection with the PETSc library. We will then present
some applications of our framework, simulating multiphase flows at various
scales, flow regimes, and resolutions. During this contribution, we will
also discuss our framework's challenges and future objectives.
:::

(matt-young)=

:::{topic} **PETSc in the Ionosphere**
**Matt Young**

University of New Hampshire

A planet's ionosphere is the region of its atmosphere where a fraction
of the constituent atoms or molecules have separated into positive ions and
electrons. Earth's ionosphere extends from roughly 85 km during the day
(higher at night) to the edge of space. This partially ionized regime
exhibits collective behavior and supports electromagnetic phenomena that do
not exist in the neutral (i.e., un-ionized) atmosphere. Furthermore, the
abundance of neutral atoms and molecules leads to phenomena that do not
exist in the fully ionized space environment. In a relatively narrow
altitude range of Earth's ionosphere called the "E region", electrons
behave as typical charged particles -- moving in response to combined
electric and magnetic fields -- while ions collide too frequently with
neutral molecules to respond to the magnetic field. This difference leads
to the Farley-Buneman instability when the local electric field is strong
enough. The Farley-Buneman instability regularly produces irregularities in
the charged-particle densities that are strong enough to reflect radio
signals. Recent research suggests that fully developed turbulent
structures can disrupt GPS communication.

The Electrostatic Parallel Particle-in-Cell (EPPIC) numerical simulation
self-consistently models instability growth and evolution in the E-region
ionosphere. The simulation includes a hybrid mode that treats electrons as
a fluid and treats ions as particles. The particular fluid electron model
requires the solution of an elliptic partial differential equation for the
electrostatic potential at each time step, which we represent as a linear
system that the simulation solves with PETSc. This presentation will
describe the original development of the 2D hybrid simulation, previous
results, recent efforts to extend to 3D, and implications for modeling GPS
scintillation.
:::

(chonglin-zhang)=

:::{topic} **XGCm: An Unstructured Mesh Gyrokinetic Particle-in-cell Code for Exascale Fusion Plasma Simulations**
**Chonglin Zhang**

, Cameron W. Smith, Mark S. Shephard

Rensselaer Polytechnic Institute (RPI)

We report the development of XGCm, a new distributed unstructured mesh
gyrokinetic particle-in-cell (PIC) code, short for x-point included
gyrokinetic code, mesh-based. The code adopts the physical algorithms of the
well-established XGC code. It is intended as a testbed for experimenting
with new numerical and computational algorithms, which can eventually be
adopted in XGC and other PIC codes. XGCm is developed on top of several
open-source libraries, including Kokkos, PETSc, Omega, and PUMIPic. Omega
and PUMIPic rely on Kokkos to interact with the GPU accelerator, while
PETSc solves the gyrokinetic Poisson equation on either CPU or GPU. We
first discuss the numerical algorithms of our mesh-centric approach for
performing PIC calculations. We then present a code validation study using
the cyclone base case with ion temperature gradient turbulence (case 5 from
Burckel et al., Journal of Physics: Conference Series 260, 2010, 012006).
Finally, we discuss the performance of XGCm and present weak scaling
results using up to the full system (27,648 GPUs) of the Oak Ridge National
Laboratory's Summit supercomputer. Overall, XGCm executes all PIC
operations on the GPU accelerators and exhibits good performance and
portability.
:::

(hong-zhang-ms)=

:::{topic} **PETSc DMNetwork: A Library for Scalable Network PDE-Based Multiphysics Simulation**
**Hong Zhang (Ms.)**

Argonne National Laboratory, Illinois Institute of Technology

We present DMNetwork, a high-level set of routines included in the PETSc
library for the simulation of multiphysics phenomena over large-scale
networked systems. The library aims at applications with networked
structures like those in electrical, water, and traffic
distribution systems. DMNetwork provides data and topology management,
parallelization for multiphysics systems over a network, and hierarchical
and composable solvers to exploit the problem structure. DMNetwork eases
the simulation development cycle by providing the necessary infrastructure
to define and query the network components through simple abstractions.
:::
81109cc9507SBarry Smith
81209cc9507SBarry Smith(hui-zhou)=
81309cc9507SBarry Smith
81409cc9507SBarry Smith:::{topic} **MPI Multiply Threads**
81509cc9507SBarry Smith**Hui Zhou**
81609cc9507SBarry Smith
81709cc9507SBarry SmithArgonne National Laboratory
81809cc9507SBarry Smith
In the traditional MPI+Thread programming paradigm, MPI and OpenMP each
manage their own parallelism, and MPI is unaware of the thread
context. The requirement of thread safety and message ordering forces the MPI
library to blindly add critical sections, unnecessarily serializing the
code. On the other hand, OpenMP cannot use MPI for inter-thread
communications, so developers often need to hand-roll algorithms for
collective operations and non-blocking synchronizations.

MPICH recently added a few extensions to address the root issues in
MPI+Thread. The first extension, MPIX stream, allows applications to
explicitly pass the thread context into MPI. The second extension, thread
communicator, allows individual threads in an OpenMP parallel region to use
MPI for inter-thread communications. In particular, this allows an OpenMP
program to use PETSc within a parallel region.

Instead of MPI+Thread, we refer to this new pattern as MPI x Thread.
:::

(junchao-zhang)=

:::{topic} **PETSc on the GPU**
**Junchao Zhang**

Argonne National Laboratory

In this mini-tutorial, we will briefly introduce the GPU backends of PETSc and show how to configure, build, run,
and profile PETSc on GPUs. We will also discuss how to port your PETSc code to GPUs.
:::

(hong-zhang-mr)=

:::{topic} **PETSc and PyTorch Interoperability**
**Hong Zhang (Mr.)**

Argonne National Laboratory

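As a rough sketch of the workflow this tutorial covers: GPU support is enabled when PETSc is configured, and an existing application can be moved to the GPU by switching the vector and matrix types at the command line. The binary name `./app` below is only a placeholder, and the exact options may vary with the PETSc version.

```console
# Configure PETSc with the CUDA backend (HIP, SYCL, and Kokkos backends also exist)
$ ./configure --with-cuda=1

# Run an existing PETSc application on the GPU by switching the vector and
# matrix types at runtime -- no source changes required
$ ./app -vec_type cuda -mat_type aijcusparse

# Profile the run; -log_view reports per-stage timings, including GPU work
$ ./app -vec_type cuda -mat_type aijcusparse -log_view
```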
In this mini-tutorial, we will introduce how to convert between PETSc vectors/matrices and PyTorch tensors;
how to generate the Jacobian, or the action of the Jacobian, with PyTorch and use it in PETSc; and how to use
PETSc and PyTorch to solve ODEs and train neural ODEs.
:::

(stefano-zampini)=

:::{topic} **petsc4py**
**Stefano Zampini**

King Abdullah University of Science and Technology (KAUST)

In this mini-tutorial, we will introduce petsc4py, the Python bindings for PETSc.
:::

(matt-knepley)=

:::{topic} **DMPlex**
**Matt Knepley**

University at Buffalo

In this mini-tutorial, we will introduce DMPlex, the PETSc class for managing unstructured meshes.
:::

(id2)=

:::{topic} **DMSwarm**
**Joseph Pusztay**

University at Buffalo

In this mini-tutorial, we will introduce DMSwarm, the PETSc class for particle discretizations.
:::

[c_04]: https://petsc.gitlab.io/annual-meetings/2023/slides/HongZhangMr.ipynb
[s_00]: https://petsc.gitlab.io/annual-meetings/2023/tutorials/petsc_annual_meeting_2023_tutorial.pdf
[s_01]: https://petsc.gitlab.io/annual-meetings/2023/slides/BarrySmith.pdf
[s_02]: https://petsc.gitlab.io/annual-meetings/2023/slides/SaraCalandrini.pdf
[s_03]: https://petsc.gitlab.io/annual-meetings/2023/slides/BerendvanWachem.pdf
[s_04]: https://petsc.gitlab.io/annual-meetings/2023/slides/HongZhangMr.pdf
[s_06]: https://petsc.gitlab.io/annual-meetings/2023/slides/DavidMay.pdf
[s_07]: https://petsc.gitlab.io/annual-meetings/2023/slides/TimSteinhoff.pdf
[s_08]: https://petsc.gitlab.io/annual-meetings/2023/slides/DerekGaston.pdf
[s_09]: https://petsc.gitlab.io/annual-meetings/2023/slides/HeehoPark.pdf
[s_10]: https://petsc.gitlab.io/annual-meetings/2023/slides/JoseERoman.pdf
[s_11]: https://petsc.gitlab.io/annual-meetings/2023/slides/HuiZhou.pdf
[s_12]: https://petsc.gitlab.io/annual-meetings/2023/slides/JunchaoZhang.pdf
[s_13]: https://petsc.gitlab.io/annual-meetings/2023/slides/JustinChang.pdf
[s_14]: https://petsc.gitlab.io/annual-meetings/2023/slides/StefanoZampini.pdf
[s_15]: https://petsc.gitlab.io/annual-meetings/2023/slides/JacobFaibussowitsch.pdf
[s_16]: https://petsc.gitlab.io/annual-meetings/2023/slides/LucBerger-Vergiat.pdf
[s_17]: https://petsc.gitlab.io/annual-meetings/2023/slides/SpencerPatty.pdf
[s_20]: https://petsc.gitlab.io/annual-meetings/2023/slides/ZakariaeJorti.pdf
[s_21]: https://petsc.gitlab.io/annual-meetings/2023/slides/AlexGrant.pdf
[s_22]: https://petsc.gitlab.io/annual-meetings/2023/slides/MohamadIbrahimCheikh.pdf
[s_23]: https://petsc.gitlab.io/annual-meetings/2023/slides/HongZhangMs.pdf
[s_24]: https://petsc.gitlab.io/annual-meetings/2023/slides/ChonglinZhang.pdf
[s_25]: https://petsc.gitlab.io/annual-meetings/2023/slides/DanielFinn.pdf
[s_26]: https://petsc.gitlab.io/annual-meetings/2023/slides/JosephPusztay.pdf
[s_27]: https://petsc.gitlab.io/annual-meetings/2023/slides/JosephPusztayDMSwarm.pdf
[s_28]: https://petsc.gitlab.io/annual-meetings/2023/slides/AidanHamilton.pdf
[s_29]: https://petsc.gitlab.io/annual-meetings/2023/slides/MariaVasilyeva.pdf
[s_30]: https://petsc.gitlab.io/annual-meetings/2023/slides/HansolSuh.pdf
[s_31]: https://petsc.gitlab.io/annual-meetings/2023/slides/MattYoung.pdf
[s_32]: https://petsc.gitlab.io/annual-meetings/2023/slides/BlaiseBourdin.pdf
[s_33]: https://petsc.gitlab.io/annual-meetings/2023/slides/JakubKruzik.pdf
[s_34]: https://petsc.gitlab.io/annual-meetings/2023/slides/MarekPecha.pdf
[v_00]: https://youtu.be/rm34jR-p0xk
[v_01]: https://youtu.be/vqx6b3Hg_6k
[v_02]: https://youtu.be/pca0jT86qxU
[v_03]: https://youtu.be/obdKq9SBpfw
[v_04]: https://youtu.be/r_icrhAbmSQ
[v_06]: https://youtu.be/0BplD93cSe8
[v_07]: https://youtu.be/vENWhqp7XlI
[v_08]: https://youtu.be/aHL4FIu_q6k
[v_10]: https://youtu.be/2qhtMsvYw4o
[v_11]: https://youtu.be/plfB7XVoqSQ
[v_12]: https://youtu.be/8tmswLh3ez0
[v_14]: https://youtu.be/hhe0Se4pkSg
[v_15]: https://youtu.be/IbjboeTYuAE
[v_17]: https://youtu.be/Baz4GVp4gQc
[v_18]: https://youtu.be/jURFyoONRko
[v_20]: https://youtu.be/k8PozEb4q40
[v_21]: https://youtu.be/0L9boKxXPmA
[v_22]: https://youtu.be/e101L03bO8A
[v_23]: https://youtu.be/heWln8ZIrHc
[v_24]: https://youtu.be/sGP_9JStYR8
[v_25]: https://youtu.be/b-V_j4Vs2OA
[v_26]: https://youtu.be/b-V_j4Vs2OA?t=1200
[v_27]: https://youtu.be/FaAVV8-lnZI
[v_28]: https://youtu.be/Ys0CZLha1pA
[v_29]: https://youtu.be/Br-9WgvPG7Q
[v_30]: https://youtu.be/8WvZ9ggB3x0
[v_31]: https://youtu.be/hS3nOmX_g8I
[v_32]: https://youtu.be/mfdmVbHsYK0
[v_33]: https://youtu.be/2dC_NkGBBnE
[v_34]: https://youtu.be/2dC_NkGBBnE?t=1194