A serious application using MPI????


 
# 1  
Old 03-25-2013
A serious application using MPI????

Hey friends,
I am very new to the world of the Message Passing Interface (MPI) and am learning to write small programs with it on my personal cluster. I intend to do my final year project using MPI. Could you tell me what kind of application one could develop that would be considered serious enough for a semester project?

Thanks for your time!!
# 2  
Old 03-25-2013
Well, to exploit parallelism you can pipeline or multi-serve. Pipelining means doing parts of the problem in multiple modules that pull work from one MPI channel and push product to another. The multi-serve concept means dividing the problem in rotation or by some internal key, so when problem parts arrive at a dispatcher, it sends them to different parallel, identical instances. The division can be round-robin, on a queue-depth basis when service duration is variable, or based on a message value, like a stock symbol to drive a book-and-match engine, or two account digits to split the flow into 100 streams.

From what I have seen of MPI, it lacks the transactional nature of IBM's MQ, which allows restarts and supports RDBMS interfaces. And where you have scatter, you need gather, so MPI processes that collect data from many sources can merge them, perhaps in a binary tree of stream merges. MPI supports loosely coupled systems and can be used to manage memory and process dispatching, where the dispatcher assigns CPUs to threads so they empty the fuller queues and fill the emptier ones.

One idea I had was a high-speed sorting container, where the input is heavily buffered in parallel for quick writing and the sorting is dynamically divided into parallel streams followed by a merge tree. MPI can also be used to flow data into a deep buffer at a point where file I/O can support aggregation, to flow data into or out of an RDBMS, and to flow data to and from files. Parallel flat-file reading is not that difficult, and for fixed-size records written to flat files, parallel writing is also possible. MPI supports heterogeneous processing, so you could code in Java and run on a mix of platforms: PCs, Macs, SPARCs, etc. The mind boggles.
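To make the scatter/gather idea concrete, here is a minimal sketch in C. It assumes an MPICH or Open MPI toolchain (build with mpicc, launch with mpirun); the chunk size and the partial-sum "work" are placeholder assumptions, not part of any particular application.

Code:
#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

enum { CHUNK = 4 };   /* items handed to each rank -- an arbitrary assumption */

int main(int argc, char *argv[])
{
    int rank, nprocs;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

    int *all = NULL;
    if (rank == 0) {                       /* rank 0 acts as the dispatcher */
        all = malloc(sizeof(int) * CHUNK * nprocs);
        for (int i = 0; i < CHUNK * nprocs; i++)
            all[i] = i + 1;                /* dummy input: the numbers 1..N */
    }

    /* Scatter: every rank receives its own slice of the input. */
    int mine[CHUNK];
    MPI_Scatter(all, CHUNK, MPI_INT, mine, CHUNK, MPI_INT, 0, MPI_COMM_WORLD);

    /* Each rank does its share of the work -- here just a partial sum. */
    int partial = 0;
    for (int i = 0; i < CHUNK; i++)
        partial += mine[i];

    /* Gather: collect the partial results back on rank 0 and merge them. */
    int *partials = NULL;
    if (rank == 0)
        partials = malloc(sizeof(int) * nprocs);
    MPI_Gather(&partial, 1, MPI_INT, partials, 1, MPI_INT, 0, MPI_COMM_WORLD);

    if (rank == 0) {
        int total = 0;
        for (int i = 0; i < nprocs; i++)
            total += partials[i];
        printf("total = %d\n", total);
        free(all);
        free(partials);
    }

    MPI_Finalize();
    return 0;
}

The same skeleton (root scatters, workers compute, root gathers) stretches to the other ideas above: swap the partial sum for a sort of each chunk and replace the final loop with a merge, and you have the parallel-sort-plus-merge-tree project in miniature.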