[Getdp] Coupling with other codes || PETSc

Thomas Jung Thomas.Jung at iisb.fraunhofer.de
Tue Jul 4 10:54:06 CEST 2006


On Saturday 13 May 2006 23:28, Christophe Geuzaine wrote:
> Kubicek Bernhard wrote:
> > Hello!
> > First of all, I would like to thank you for getDP, which seems to be a
> > really nice tool, and especially for providing it as free software. This
> > is simply great!
> > A colleague of mine and I are working on a project in which we want
> > to simulate a multiphysics problem involving numerical
> > electromagnetics and CFD. As far as I can conclude from the mailing
> > list, getDP has no Navier-Stokes solver yet, so we want to couple
> > getDP with an external CFD code (FLUENT, in our case, which sadly
> > seems not to have an electromagnetic solver apart from the
> > unaffordable ANSYS-MPCCI combination). As we have a transient
> > problem, this means first computing a time step with the CFD code,
> > then passing the relevant data to getDP, which after its own time
> > step returns data for the next CFD time step.
> > We are not 100% sure we can compute something useful this way, but
> > for small step sizes we hope to be lucky.
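> >
> > In getdp terms, we imagine the electromagnetic half of one such
> > exchange roughly like this (untested sketch; the formulation name
> > and constants are made up, and the "read the CFD data" step is
> > precisely what we are missing):
> >
> >   tMax = 1.;    // illustrative end time [s]
> >   dt   = 0.01;  // illustrative time step [s]
> >   Resolution {
> >     { Name Coupled;
> >       System { { Name A; NameOfFormulation MagDynT; } }
> >       Operation {
> >         InitSolution[A];
> >         TimeLoopTheta[0., tMax, dt, 1.]{
> >           // <read the temperature field computed by the CFD code
> >           //  for this time step -- see the question below>
> >           Generate[A]; Solve[A]; SaveSolution[A];
> >           // write back e.g. the Joule losses for the next CFD step:
> >           PostOperation[JouleLosses];
> >         }
> >       }
> >     }
> >   }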
> >
> > My first question concerns passing a scalar field to getdp, i.e. a
> > known temperature field. This field needs to be used to compute the
> > conductivity in each volume element. InterpolationLinear does not
> > seem suitable for functions of three parameters.
> > Is there some way of defining a field using a linear array and the
> > number of the current volume element? Maybe something similar to
> > this fantasy code:
> > TemperatureField = {0, 0, 0.1, 0.2, ...}; // one value per finite volume
> > T[] = TemperatureField[$NrMomentaryVolume];
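> >
> > The closest thing we found in plain .pro syntax is a piecewise
> > definition of the function per region, which works but does not
> > scale to one value per finite volume (untested sketch, with made-up
> > region tags and material constants):
> >
> >   Group {
> >     Vol1 = Region[1];  // illustrative region tags
> >     Vol2 = Region[2];
> >   }
> >   sigma0 = 5.96e7;   // reference conductivity [S/m], illustrative
> >   alpha  = 3.9e-3;   // temperature coefficient [1/K], illustrative
> >   Function {
> >     T[ Vol1 ] = 300.;   // one value per region, not per element
> >     T[ Vol2 ] = 312.5;
> >     sigma[]   = sigma0 / (1. + alpha * (T[] - 300.));
> >   }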
>
> I think some of my colleagues in Liege do that kind of thing on a
> regular basis: I've CC'ed them on this reply. (Patrick?)
>
> If you use the same mesh in both codes you could simply write a .res
> file containing the temperatures (and define the corresponding function
> space in your .pro file). If the meshes are different, what I would
> probably do is to add a new routine in getdp to read a post-processing
> file (e.g. in Gmsh format) and use an octree to locate the data.
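>
> Just to fix ideas, the user-facing side of such a routine could look
> something like this (a hypothetical sketch: neither the function nor
> the read operation exists yet, and the material law is illustrative):
>
>   Function {
>     // evaluate the temperature view read from the Gmsh
>     // post-processing file at the current point, located via octree:
>     T[] = ScalarField[ XYZ[] ];
>     sigma[] = 5.96e7 / (1. + 3.9e-3 * (T[] - 300.));
>   }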
>
> > The second question is about the state of the PETSc implementation.
> > In a not-so-old post you stated that PETSc is working, but only on
> > a single CPU. As a parallel solver could be very useful to us, it
> > would be great to have a rough idea of how many months away this
> > might be from, well, let's call it "field" testing.
>
> I'm not sure if getdp in its current incarnation will ever see a "real"
> parallel version: it's basically a matter of people interested in that
> stuff coding it up... (The problem is that it's not trivial to do this
> right, so that the parallelization is both efficient and "general".
> Maybe we'll have to wait for the next complete rewrite of the code.)
>
> > Apart from these questions, I have a half-finished informal
> > introduction to the weak form (only a few pages in LaTeX), written
> > for those who hesitate to read Bossavit. If you are interested in
> > it and want to read/comment on it before perhaps spreading it to
> > other people, I could finish writing it up within the next two
> > weeks. Although I have to admit it might not reach your
> > mathematical standards.
>
> That would be great: you could post it to the mailing list and/or the wiki.
>
> Christophe
>
> > Thank you very much,
> > and greetings from cloudy Vienna,
> > Bernhard Kubicek
> > freelancer for arsenal vienna, ARC / PhD student, Vienna University
> > of Technology



1.) Bernhard, I am trying to do the same thing: couple getdp (Lorentz
forces) and Fluent. Are you still working on it? Perhaps we can do
something together?

2.) I also think I will need a parallel implementation, and I would
invest some time in it. I have been looking at the code, and a lot
seems to be done already. However, it is difficult for me to see where
to start and what is missing.
Christophe, or whoever wrote the current parallel implementation: could
you tell us what is missing and where the current problems are?

-- 
Thomas Jung
Fraunhofer-Institut IISB
D-91058 Erlangen, Schottkystr. 10
+49 9131 761264