- From: Xavier Lacoste <xl64100@gmail.com>
- To: starpu-devel@lists.gforge.inria.fr
- Subject: [Starpu-devel] StarPU MPI and Dags display
- Date: Thu, 27 Feb 2014 16:09:20 +0100
- List-archive: <http://lists.gforge.inria.fr/pipermail/starpu-devel>
- List-id: "Developers list. For discussion of new features, code changes, etc." <starpu-devel.lists.gforge.inria.fr>
Hello,
I must be doing something wrong with my algorithm: it does not give the right
solution with StarPU + MPI.
In my application I first submit the tasks that update local data using remote
data (GEMM).
Then I follow the local algorithm, submitting both tasks that use only local
data (HETRF_TRSM + GEMM) and tasks that update remote data using local data
(GEMM).
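In StarPU terms, the two submission phases look roughly like this (a
simplified sketch, not my actual code: block(), is_local(), updates(),
nblocks and contribution are placeholders for my real structures):

#include <starpu.h>
#include <starpu_mpi.h>

/* Placeholders standing in for my real structures:
 * block(i)       returns the registered handle of column block i,
 * is_local(i)    tells whether this MPI rank owns block i,
 * updates(a, b)  tells whether block a contributes a GEMM to block b. */
extern starpu_data_handle_t block(int i);
extern int is_local(int i);
extern int updates(int a, int b);
extern int nblocks;
extern struct starpu_codelet gemm_cl;       /* nbuffers = 2: R, RW */
extern struct starpu_codelet hetrf_trsm_cl; /* nbuffers = 1: RW */
extern int *contribution;                   /* pending GEMMs per block */

void submit_all(void)
{
    int c1, c2;

    /* Phase 1: halo updates, remote c2 contributing to local c1. */
    for (c1 = 0; c1 < nblocks; c1++)
        if (is_local(c1))
            for (c2 = 0; c2 < nblocks; c2++)
                if (updates(c2, c1) && !is_local(c2))
                    starpu_mpi_insert_task(MPI_COMM_WORLD, &gemm_cl,
                                           STARPU_VALUE, &c2, sizeof(c2),
                                           STARPU_VALUE, &c1, sizeof(c1),
                                           STARPU_R,  block(c2),
                                           STARPU_RW, block(c1),
                                           0);

    /* Phase 2: seed the local algorithm with the blocks that have no
     * pending contributions; the remaining HETRF_TRSM are submitted
     * from inside the GEMM tasks (nested submission, see below). */
    for (c1 = 0; c1 < nblocks; c1++)
        if (is_local(c1) && contribution[c1] == 0)
            starpu_mpi_insert_task(MPI_COMM_WORLD, &hetrf_trsm_cl,
                                   STARPU_RW, block(c1),
                                   0);
}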
When I look at the generated DAG, it looks correct on one processor:
https://dl.dropboxusercontent.com/u/2234149/DAG_STARPU/DAG_OK.pdf
On two processors, the "halo" tasks submitted at the beginning appear at the
top of the graph, where they should follow some HETRF_TRSM (a GEMM always
depends on a HETRF_TRSM):
https://dl.dropboxusercontent.com/u/2234149/DAG_STARPU/DAG_KO.pdf
I guess the GEMM tasks at the beginning correspond to the 21 halo tasks
submitted in the first step by proc 0 (in this case only data from proc 1 is
used to update data on proc 0, not the contrary)....
The same tasks are submitted on proc 1 in the second submission phase and
depend on HETRF_TRSM tasks (as I do nested task submission, the HETRF_TRSM
should even have run before the GEMMs are submitted)....
Are the dot graphs reliable with MPI?
My algorithm looks like this:

    for each local column block c1
        for each column block c2 used for updating c1
            if c2 is not local
                submit_task(GEMM, c2, c1)

    for each local column block c1
        if (contribution(c1) == 0)
            submit_task(HETRF_TRSM, c1)

    HETRF_TRSM(c1) {
        COMPUTATION
        for each column block c2 updated by c1
            submit_task(GEMM, c1, c2)
    }

    GEMM(c2, c1) {
        COMPUTATION
        contribution(c1)--
        if (contribution(c1) == 0)
            submit_task(HETRF_TRSM, c1)
    }

(In GEMM(c2, c1), c1 is always the updated block, i.e. the second argument
at both call sites.)
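The nested submission happens inside the GEMM kernel itself, roughly like
this (again a sketch, using the same placeholders as above; the actual BLAS
call is omitted):

#include <starpu.h>
#include <starpu_mpi.h>

extern starpu_data_handle_t block(int i);   /* as in the sketch above */
extern struct starpu_codelet hetrf_trsm_cl;
extern int *contribution;

void gemm_cpu(void *buffers[], void *cl_arg)
{
    int c2, c1;
    starpu_codelet_unpack_args(cl_arg, &c2, &c1);

    /* ... fetch the block pointers from buffers[] and apply the
     * GEMM update of block c1 by block c2 ... */
    (void) buffers;

    /* The GEMMs targeting c1 take it in (non-commuting) STARPU_RW
     * mode, so they run one after the other and the plain decrement
     * is safe.  The last one submits the HETRF_TRSM of c1. */
    if (--contribution[c1] == 0)
        starpu_mpi_insert_task(MPI_COMM_WORLD, &hetrf_trsm_cl,
                               STARPU_RW, block(c1),
                               0);
}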
Does anything look wrong to you in this?
(Here the GEMM(..., c1) tasks are not commutable (because STARPU_COMMUTE was
not available in starpu_mpi_task_build()), but they could be.)
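For reference, the commutable version would only change the access mode of
the updated block: the GEMM submission in the first sketch would become
something like this (hypothetical, since the flag is not accepted on the MPI
submission path today):

    /* With STARPU_COMMUTE, the GEMMs updating c1 could be executed in
     * any order instead of strictly in submission order. */
    starpu_mpi_insert_task(MPI_COMM_WORLD, &gemm_cl,
                           STARPU_VALUE, &c2, sizeof(c2),
                           STARPU_VALUE, &c1, sizeof(c1),
                           STARPU_R, block(c2),
                           STARPU_RW | STARPU_COMMUTE, block(c1),
                           0);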
Thanks,
XL.