Insert into database from an Excel file only rows that don't exist, using C#, EF, and SQL Server

I read an Excel file and insert its data into a database table, but every time I do this it inserts the existing rows again along with the new data. I only want to insert the rows that aren't already in the table; my unique ID is the row's timestamp.



For example, this is what happens currently when I do the first insert:



ExcelFile                      Database Table

a | b | date                   a | b | date
------------------             ------------------
1 | 1 | 2018/02/12             1 | 1 | 2018/02/12
2 | 2 | 2018/03/12             2 | 2 | 2018/03/12


This happens when I do the second insert:



ExcelFile                      Database Table

a | b | date                   a | b | date
------------------             ------------------
1 | 1 | 2018/02/12             1 | 1 | 2018/02/12
2 | 2 | 2018/03/12             2 | 2 | 2018/03/12
3 | 3 | 2018/04/12             1 | 1 | 2018/02/12
                               2 | 2 | 2018/03/12
                               3 | 3 | 2018/04/12


I use Entity Framework and the ExcelDataReader package to do this:



var result = reader.AsDataSet();

DataTable dt = new DataTable();
dt = result.Tables[0]; // here I store the data from the Excel file

foreach (DataRow row in dt.Rows)
{
    using (AppContext context = new AppContext())
    {
        Data data = new Data();
        string date = row.ItemArray[4].ToString();
        DateTime parseDate = DateTime.Parse(date);
        Data datos = new Data
        {
            a = row.ItemArray[0].ToString(),
            b = row.ItemArray[1].ToString(),
            c = row.ItemArray[2].ToString(),
            d = row.ItemArray[3].ToString(),
            e = parseDate
        };
        context.Data.Add(datos);
        context.SaveChanges();
    }
}
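For reference, reader above comes from ExcelDataReader; a minimal sketch of how it might be created (the file path is only a placeholder, and AsDataSet comes from the ExcelDataReader.DataSet extension package):

// On .NET Core, register the code-pages provider first:
// System.Text.Encoding.RegisterProvider(System.Text.CodePagesEncodingProvider.Instance);
using (var stream = File.Open(@"C:\data\copy.xlsx", FileMode.Open, FileAccess.Read))
using (var reader = ExcelReaderFactory.CreateReader(stream))
{
    var result = reader.AsDataSet();
    DataTable dt = result.Tables[0];
    // ... the foreach loop above goes here
}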




Is there a way to filter the Excel file or compare the two before inserting?



I'm all ears.

c# sql-server excel entity-framework exceldatareader

asked Nov 13 '18 at 21:49 by HiramL. (edited Nov 13 '18 at 22:00 by marc_s)







  • Is there a reason why you are using the Date as the unique Id?
    – Jimenemex, Nov 13 '18 at 21:54

  • Do you have an option to update the Excel workbook and mark values that have been processed by adding a value in another column? That would be ideal. Otherwise you would need to check every single value against the database every time. Even better than marking them, perhaps move them to a different sheet. Or even delete them.
    – Scott Hannen, Nov 14 '18 at 1:38

  • @Jimenemex Yes. The Excel file is autogenerated and the only value that is unique for every row is the date. I also can't touch the original Excel file, so I had to write a method that copies it every few minutes and work with that copy.
    – HiramL., Nov 14 '18 at 2:59

  • @mjwills In the Excel file the only unique key could be the date and hour; that is the only cell that differs for every row. My DB table, on the other hand, does have a PK :)
    – HiramL., Nov 14 '18 at 3:02

  • @ScottHannen That sounds like a lot of extra steps; we had this idea, but it was rejected because of the app's performance. Anyway, this was simply resolved by MikeH: storing the date value and checking for an existing row with the same DateTime.
    – HiramL., Nov 14 '18 at 3:06

3 Answers

Check for an existing row before adding it. The code below should go right after the point where you calculate parseDate:



var existingRow = context.Data.FirstOrDefault(d => d.e == parseDate); // note: ".e" should refer to your "date" field
if (existingRow != null)
{
    // This row already exists
}
else
{
    // It doesn't exist, go ahead and add it
}






answered Nov 13 '18 at 21:52 by MikeH

  • Thanks, this did the thing correctly and it performs very well. I was trying to create a stored procedure that could merge the Excel DataTable with the table in my database, but that seemed complicated since I am new to EF. Greetings, Mike!
    – HiramL., Nov 14 '18 at 3:38
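If the table grows large, one query per row can become the bottleneck. A possible variant (a sketch only; it assumes the date column fits in memory and reuses a single AppContext and the dt table from the question) is to load the existing dates once into a HashSet and check membership in memory:

// Load every stored date once, then test membership locally.
var existingDates = new HashSet<DateTime>(context.Data.Select(d => d.e));

foreach (DataRow row in dt.Rows)
{
    DateTime parseDate = DateTime.Parse(row.ItemArray[4].ToString());
    if (existingDates.Contains(parseDate))
        continue; // already in the table, skip it

    // build the Data entity, Add it and SaveChanges as in the question's code
    existingDates.Add(parseDate); // also skips duplicates within the same file
}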

If "a" is the PK on the table and unique across rows then I would check the existence of existing rows by ID before inserting. Similar to Mike's response, though one consideration is if the table has a number of columns I would avoid returning the entity, but rather just an exists check using .Any()



var aValue = row.ItemArray[0].ToString();
if (!context.Data.Any(x => x.a == aValue))
{
    // insert the row as a new entity
}


The caveat here is that if the Excel file contains edits (existing rows whose data has changed), this approach will not accommodate that.



For bulk import processes, I would typically approach this by first staging the Excel data into a staging table (purging the staging table prior to each import). From there I would have entities mapped to the staging tables as well as entities mapped to the "real" tables. If a "modified date" can be extracted from the file for each record, I would also store the import date/time in the application, so that when selecting rows to import from the staging table you only get records whose modified date/time is greater than the last import date/time. From there you can query data from the staging table in batches and look for new records vs. existing modifications. I find that querying entities on both sides of the migration is more flexible than dealing with an in-memory block for the import. With small imports it may not be worthwhile, but for larger files, where you will want to work with smaller sub-sets and filtering, it can make things easier.
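A rough sketch of that staging approach (StagingData, its ModifiedDate column, and the stored lastImportDate are assumed names, not something defined in the question):

const int batchSize = 500;

using (var context = new AppContext())
{
    // Only staging rows newer than the last import, in a stable order for paging.
    var pending = context.StagingData
        .Where(s => s.ModifiedDate > lastImportDate)
        .OrderBy(s => s.ModifiedDate);

    for (int skip = 0; ; skip += batchSize)
    {
        var batch = pending.Skip(skip).Take(batchSize).ToList();
        if (batch.Count == 0)
            break;

        foreach (var staged in batch)
        {
            // New record vs. existing modification.
            if (!context.Data.Any(d => d.e == staged.ModifiedDate))
                context.Data.Add(new Data { e = staged.ModifiedDate /* map the remaining columns here */ });
            // else: load the existing entity and apply the changed values
        }

        context.SaveChanges();
    }
}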






answered Nov 14 '18 at 0:05 by Steve Py

I could do exactly what I needed with the help of @MikeH. With this, only the rows with a different DateTime are added (the DateTime is always an ascending value in my case):



foreach (DataRow row in dt.Rows) // dt = my DataTable loaded with ExcelDataReader
{
    using (AppContext context = new AppContext())
    {
        string date = row.ItemArray[4].ToString();
        DateTime parseDate = DateTime.Parse(date); // parsed because column "e" only accepts DateTime, not string

        var existingRow = context.Data.FirstOrDefault(d => d.e == parseDate);
        if (existingRow != null)
        {
            Console.WriteLine("Do Nothing");
        }
        else
        {
            Data datos = new Data
            {
                a = row.ItemArray[0].ToString(),
                b = row.ItemArray[1].ToString(),
                c = row.ItemArray[2].ToString(),
                d = row.ItemArray[3].ToString(),
                e = parseDate
            };
            context.Data.Add(datos);
            context.SaveChanges();
        }
    }
}
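Since the date is always ascending here, a further shortcut is possible (a sketch only, not what the code above does): query the newest stored date once and insert only the rows that are newer, avoiding a lookup per row.

DateTime lastStored;
using (var context = new AppContext())
{
    // Cast to DateTime? so an empty table yields null instead of throwing.
    lastStored = context.Data.Max(d => (DateTime?)d.e) ?? DateTime.MinValue;
}

foreach (DataRow row in dt.Rows)
{
    DateTime parseDate = DateTime.Parse(row.ItemArray[4].ToString());
    if (parseDate <= lastStored)
        continue; // imported on a previous run

    // build the Data entity and save it exactly as in the loop above
}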








answered Nov 14 '18 at 3:35 by HiramL.