Subject: Developers list for StarPU
List archives
- From: Xavier Lacoste <xl64100@gmail.com>
- To: Xavier Lacoste <xl64100@gmail.com>
- Cc: starpu-devel@lists.gforge.inria.fr
- Subject: Re: [Starpu-devel] StarPU MPI and Dags display
- Date: Thu, 27 Feb 2014 16:59:33 +0100
- List-archive: <http://lists.gforge.inria.fr/pipermail/starpu-devel>
- List-id: "Developers list. For discussion of new features, code changes, etc." <starpu-devel.lists.gforge.inria.fr>
OK, it seems I didn't submit my whole graph: some GEMMs are missing, which may
explain both the wrong results and the strange dag.dot graph...
Sorry,
XL.
On 27 Feb 2014, at 16:09, Xavier Lacoste <xl64100@gmail.com> wrote:
> Hello,
>
> I must be doing something wrong with my algorithm: it does not give the right
> solution with StarPU + MPI.
>
> In my app I first submit the tasks that update local data using
> remote data (GEMM).
>
> Then I follow the local algorithm, submitting both tasks that use only local
> data (HETRF_TRSM + GEMM) and tasks that update remote data using local data
> (GEMM).
>
> When I look at the generated DAG, it looks correct on one
> processor.
> https://dl.dropboxusercontent.com/u/2234149/DAG_STARPU/DAG_OK.pdf
>
> On two processors, the "halo" tasks submitted at the beginning
> appear at the top of the graph, whereas they should follow some HETRF_TRSM (a
> GEMM always depends on a HETRF_TRSM):
> https://dl.dropboxusercontent.com/u/2234149/DAG_STARPU/DAG_KO.pdf
>
> I guess the GEMM tasks at the beginning correspond to the 21 halo tasks
> submitted in the first step by proc 0 (in this case only data from proc 1 are
> used to update data on proc 0, not the other way around)...
> The same tasks are submitted on proc 1 in the second submission phase and
> depend on HETRF_TRSM tasks (as I do nested task submission, the HETRF_TRSM
> should even have run before the GEMMs are submitted)...
>
> Is the dot graph reliable with MPI?
>
> My algorithm looks like this:
>
> for each local column block c1
>     for each column block c2 used for updating c1
>         if c2 is not local: submit_task(GEMM, c2, c1)
>
> for each local column block c1
>     if (contribution(c1) == 0)
>         submit_task(HETRF_TRSM, c1)
>
> HETRF_TRSM(c1) {
>     COMPUTATION
>
>     for each column block c2 updated by c1
>         submit_task(GEMM, c1, c2)
> }
>
> GEMM(c1, c2) {
>     COMPUTATION
>
>     contribution(c1)--;
>     if (contribution(c1) == 0)
>         submit_task(HETRF_TRSM, c1);
> }
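>
> As a rough illustration of the first loop in StarPU MPI terms (not the
> actual code: gemm_cl, nblocks, block_handle(), is_local() and updates()
> are placeholder names), the halo submission could look like:
>
> /* Hypothetical sketch (placeholder names): submit the halo GEMMs that
>  * update a local block c1 from a remote block c2 through the
>  * insert-task helper. */
> static void submit_halo_gemms(int nblocks)
> {
>     for (int c1 = 0; c1 < nblocks; c1++) {
>         if (!is_local(c1)) continue;
>         for (int c2 = 0; c2 < nblocks; c2++) {
>             if (is_local(c2) || !updates(c2, c1)) continue;
>             starpu_mpi_insert_task(MPI_COMM_WORLD, &gemm_cl,
>                                    STARPU_R,  block_handle(c2),
>                                    STARPU_RW, block_handle(c1),
>                                    0);
>         }
>     }
> }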
>
> Does something look wrong to you in this?
>
> (Here the GEMM(..., c1) tasks are not commutable (because STARPU_COMMUTE was not
> available in starpu_mpi_task_build()), but they could be.)
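>
> (For illustration only: with the task-insert interface a commutable write
> is expressed by OR-ing STARPU_COMMUTE into the access mode; if/once the MPI
> helpers accept the flag, the update of c1 would presumably be declared like
> this, reusing the placeholder names above:)
>
> starpu_mpi_insert_task(MPI_COMM_WORLD, &gemm_cl,
>                        STARPU_R,                   block_handle(c2),
>                        STARPU_RW | STARPU_COMMUTE, block_handle(c1),
>                        0);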
>
> Thanks,
>
> XL.
> _______________________________________________
> Starpu-devel mailing list
> Starpu-devel@lists.gforge.inria.fr
> http://lists.gforge.inria.fr/cgi-bin/mailman/listinfo/starpu-devel
- [Starpu-devel] StarPU MPI and Dags display, Xavier Lacoste, 27/02/2014
- Re: [Starpu-devel] StarPU MPI and Dags display, Xavier Lacoste, 27/02/2014
Archives managed by MHonArc 2.6.19+.