Subject: Addendum to review of Matsui and Okuda
From: Dave Yuen
Date: Thu, 28 Jun 2001 14:49:23 -0500 (CDT)
To: fox@csit.fsu.edu

An additional paper by Schmalzl and Hansen should be referenced in regard to their use of the finite-volume method for discretization, as opposed to finite elements:

Schmalzl, J. and U. Hansen, A fully implicit model for simulating dynamo action in a Cartesian domain, Phys. Earth Planet. Inter., 120, 339-349, 2000.

An advantage of the finite-volume method is its lower memory overhead compared with finite elements, which require more memory.

(1.) The number of elements and the GFlops attained, and on how many processors, should be stated prominently. This is important information that should hit the reader's eye at first reading.

(2.) Some reference should be made to previous work on finite elements on the sphere (Baumgardner and Frederikson, S.I.A.M., 1984; Stuhne and Peltier, J. Computational Physics, 1999), to issues involving the spectral transform over the sphere (Lesur and Gubbins, Geophys. J. International, 1999), and to finite differences over a spherical shell (Fornberg and Merrill, Comparison of finite-difference and pseudospectral methods for convective flow over a sphere, Geophys. Res. Lett., 24, 3245-3248, 1997). The authors should discuss the relative merits of finite-element methods in treating the singular polar problem. A distinct advantage of finite elements may be the use of integration by parts to remove the singularity. They should definitely look into the icosahedral method of Stuhne and Peltier (1996, 1999, both in J. Computational Physics) in this connection.

(3.) Some more discussion is needed on the future prospects of this method in relation to the growth of computational power in the near future (the next two years).

==================================
signed Dave Yuen

... differences in the spatial resolution, radial resolution, and initial temperatures in the two simulations. Furthermore, the authors mention that the radial resolution in the FEM formulation can cause serious problems and requires a smaller time step in this case, making it computationally more expensive.

While the paper is interesting and appears technically sound, I have some concerns about the contributions and the suitability of the paper given the focus of the journal and of the special issues (as listed at http://www.quakes.uq.edu.au/ACES/WG/WG5/index.html).

The paper does not address the parallel formulation of the problem, its parallel FEM implementation, or the issues, requirements, and challenges that arise when moving this class of problems to massively parallel platforms. It is only mentioned that the implementation uses the GeoFEM package and relies on that package to support parallelism. The parallel runtimes listed are for 32 processors (not massively parallel) and show a speedup of about 2.7 when moving from 8 to 32 processors, i.e., a parallel efficiency of roughly 68% relative to the 8-processor run. Furthermore, the results indicate that the FEM simulation is at least an order of magnitude more expensive than the spectral-method simulation. This is attributed to the smaller time step required in the FEM case, and also to the inefficiency of the parallel FEM code (which attains only 5% of peak). The authors do mention that the FEM formulation is new.

The overall paper requires some effort to read and understand and has many grammatical and spelling errors. A few of these are listed below.
I would suggest that a paper that focuses on the requirements and challenges of a parallel formulation of the problem, and possibly an investigation of the overheads and lack of scalability of the current GeoFEM-based implementation, would be more appropriate for this journal.

Presentation Changes:

The paper has many grammatical errors that make it quite difficult to read. A few typos are listed below.

1. Page 1, section 1, line 6: should be "Furthermore, the dynamo process is not only a complicated nonlinear system"
2. Page 1, section 1, last line: should be "Most of these simulations, however, have applied the ..."
3. Page 2, section 1, line 4: should be "... the finite difference method is applied, however they have considered"
4. Page 2, section 1, line 12: should be "... rotating spherical shell on massively parallel computers."
5. Page 2, section 1, line 21: should be "convection columns which are parallel"
6. Page 2, section 1, line 22: should be "the columns propagate westward in both cases"
7. Page 2, section 2, line 4: should be "... and rotates with a uniform angular ..."
8. Page 5, section 2, line 9: should be "each triangle is divided into three"
9. Page 6, section 2, line 11: should be "a large truncation error"
10. Page 6, section 3, line 12: should be "which are parallel"
11. Page 7, section 3, line 3: should be "Z-component"
12. Page 7, section 3, line 6: should be "position of each convection column"
13. Page 9, section 3, line 12: should be "is estimated by the phase"
14. Page 9, section 3, line 15: should be "the convection pattern propagates rapidly in the case of GeoFEM"
15. Page 10, section 4, line 2: table number missing.
16. Page 10, section 4, line 3: should be "a 10% difference is seen in several values in the results"
17. Page 10, section 4, line 5: remove "In these differences,"
18. Page 10, section 4, line 6: should be "around both the boundaries"
19. Page 12: units for column 3 missing in Table 2.