Currently I'm designing and implementing a piece of software that has to perform CRUD operations over two tables in a master-detail relationship. The header table has about half a million rows and the detail table about a million rows.
Loading all of this data into a DataSet is crazy; also, the data can change, and I'm not interested in keeping a local copy of the database. I just want the software to work fluently. Even though a DataSet may not be the best solution, I should use it to stay consistent with other parts of the software.
My first thought was to use a typed DataSet and some methods like GetNext(), GetFirst(), GetByCod(), but I'm not sure that's the best solution... I've done a little test and it doesn't run very fluently.
I'm interested in knowing how other developers handle this, what the best practices are, and what "the best choice" is for working with large amounts of data.
I'm using Visual Studio 2008 and SQL Server 2005.
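For context on the paged-access idea above: one common pattern on SQL Server 2005 is to page on the server with ROW_NUMBER() and fill only one page of rows into a DataTable at a time, so GetNext()/GetFirst() become page fetches. This is only a minimal sketch; the table name HEADER, the columns COD and DESCRIPTION, the helper name GetHeaderPage, and the connection string are placeholders, not taken from the question:

```csharp
using System.Data;
using System.Data.SqlClient;

static class HeaderPaging
{
    // Hypothetical helper: fetches one page of header rows (pageIndex is 0-based).
    // HEADER, COD and DESCRIPTION are placeholder names for illustration.
    public static DataTable GetHeaderPage(string connectionString, int pageIndex, int pageSize)
    {
        const string sql = @"
            SELECT COD, DESCRIPTION
            FROM (SELECT COD, DESCRIPTION,
                         ROW_NUMBER() OVER (ORDER BY COD) AS RowNum
                  FROM HEADER) AS numbered
            WHERE RowNum BETWEEN @first AND @last
            ORDER BY RowNum;";

        using (SqlConnection con = new SqlConnection(connectionString))
        using (SqlCommand cmd = new SqlCommand(sql, con))
        {
            cmd.Parameters.AddWithValue("@first", pageIndex * pageSize + 1);
            cmd.Parameters.AddWithValue("@last", (pageIndex + 1) * pageSize);

            DataTable page = new DataTable("HEADER");
            using (SqlDataAdapter adapter = new SqlDataAdapter(cmd))
            {
                adapter.Fill(page); // only pageSize rows travel over the wire
            }
            return page;
        }
    }
}
```

The returned DataTable can then be merged into the typed DataSet or bound directly to the grid, so navigating loads one page on demand instead of the whole table.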
ADDED: When you talk about using a SqlDataReader, do you mean something like this?
using (SqlConnection con = new SqlConnection(CON))
{
    con.Open();
    using (SqlCommand cmd = new SqlCommand("SELECT * FROM TABLE", con))
    using (SqlDataReader rd = cmd.ExecuteReader())
    {
        BindingSource bindingSource = new BindingSource();
        bindingSource.DataSource = rd;
        bindingNavigator1.BindingSource = bindingSource;
        txtFCOD.DataBindings.Add("Text", bindingSource, "FIELD");
    }
}