Forum Question: Apple numbers removing duplicate data

Hi, does anyone know how to remove duplicate info inside of Numbers?
I have a database list of names and phone numbers, and I want to remove the duplicate names and numbers.
The only info I can find on how to do this in Numbers is on this site: http://discussions.apple.com/thread.jspa?threadID=1479084
I’ve tried it several times with no luck. Can someone else explain how to do this to a Numbers newbie?
—–
Matthew

Comments: 16 Responses to “Apple numbers removing duplicate data”

    1/11/11 @ 10:34 pm

    Here’s what I would do, provided the list isn’t that long (a few hundred rows or less). I would sort the list like the page suggests. Sort by name, for instance. That should put the duplicates next to each other in the list.
    Then I would create a new column, perhaps right after the name column. I would just arrow-key down through that empty column, putting an X or 1 or something in that column next to the one you want to keep. Leave the other one blank.
    When you are done, simply sort the list again by that new column. That will put all the blank ones together and the ones with the X or 1 or whatever together. Then select the rows with the blanks and delete them all, leaving the rest.

    Jerry G
    1/12/11 @ 3:14 pm

    Matthew,

    As Gary said, adding a column to sort on and marking the duplicates is the way to go. You can automate the marking of the duplicates rather easily.

    Let’s assume that the names are in Column A, then this expression in your new column will mark each duplicate with an asterisk. After you sort on the new column all the duplicates will be grouped and you can delete that block of rows.

    Here’s the formula: =IF(COUNTIF(A,A)>1, "*", "")

    The formula needs to be in every cell of your new column. If the data you want to check is in some other column, say C, then in that case you change the As to Cs in that expression.
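For anyone who wants to see the logic of that formula outside a spreadsheet, here is a minimal Python sketch of what COUNTIF-based marking does (the names are made up for illustration):

```python
# Mark every name that appears more than once, mirroring
# =IF(COUNTIF(A,A)>1, "*", "") filled down a helper column.
from collections import Counter

def mark_duplicates(names):
    counts = Counter(names)
    # "*" next to any name whose total count exceeds 1, "" otherwise
    return [(name, "*" if counts[name] > 1 else "") for name in names]

rows = ["Alice", "Bob", "Alice", "Carol"]
print(mark_duplicates(rows))
# Note that BOTH "Alice" rows get marked, just as the formula
# marks every instance of a duplicated name.
```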

    Hope this helps.

    Jerry

      Kay S.
      10/28/11 @ 5:59 pm

      Thank you, Jerry!! It worked superbly for me: I was trying to identify multiple entries for the same person I may have in my Address Book. (First Name was in column A – so your formula worked like a charm!)

      SheilaK
      11/27/12 @ 9:46 am

      I have a list of 22,000 names. This is great for finding the duplicates, but how do I retain one unique record while deleting the duplicates? (This marks every instance, the first as well as subsequent occurrences. Is there any way to keep the first occurrence while eliminating only the later ones?)

        11/27/12 @ 10:46 am

        If you create it right, it shouldn’t mark every instance. Look at my original comment. You could create a formula that looks for a duplicate in the cell directly above, so the first occurrence isn’t marked but the others are.
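A sketch of that keep-first idea in Python, assuming the list is already sorted so duplicates sit next to each other (the spreadsheet equivalent would be a formula along the lines of =IF(A2=A1,"*",""), filled down the sorted column):

```python
# Mark only the second and later occurrences by comparing each
# entry to the one directly above it in a sorted list.
def mark_later_duplicates(sorted_names):
    marks = []
    for i, name in enumerate(sorted_names):
        # The first row has nothing above it, so it is never marked.
        marks.append("*" if i > 0 and name == sorted_names[i - 1] else "")
    return marks

names = ["Alice", "Alice", "Bob", "Carol", "Carol", "Carol"]
print(mark_later_duplicates(names))
# ['', '*', '', '', '*', '*']  -- first occurrences stay blank
```

Sorting rows by the mark column then lets you delete every starred row and keep exactly one copy of each name.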

    Heather
    2/27/11 @ 3:01 am

    Hm… that’s great, but –
    What about if I have two different lists of email addresses?

    I know each list contains some email addresses that are the same. What I want is to find exactly those email addresses, highlight them, and then strip them from one list.

    Then I can merge the two lists and have “one” list with all unique (once only) email addresses!

    Thanks for your help

      2/27/11 @ 10:03 am

      Merge them first, then remove the duplicates.
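The merge-then-dedupe approach can be sketched in Python with some invented addresses. Keeping the first occurrence of each address gives one combined list where every address appears exactly once:

```python
# Merge two email lists, then keep only the first occurrence of
# each address, so the combined list has no duplicates.
def merge_unique(list_a, list_b):
    seen = set()
    merged = []
    for email in list_a + list_b:
        key = email.lower()   # email addresses compare case-insensitively
        if key not in seen:
            seen.add(key)
            merged.append(email)
    return merged

a = ["pat@example.com", "sam@example.com"]
b = ["Sam@example.com", "kim@example.com"]
print(merge_unique(a, b))
# ['pat@example.com', 'sam@example.com', 'kim@example.com']
```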

    Heather
    2/27/11 @ 7:31 pm

    Thanks Gary, but this does not do it!
    As a matter of fact, this would make it worse. Think about what the question was for a moment, then you will agree!

    Nevertheless, I found the answer somewhere else, thanks a lot

      2/27/11 @ 8:54 pm

      How? How would it make it worse? You want one list with each email address. I suggested merging them and then removing the duplicates. Seems to me that would solve the problem.

      RLBeemanq
      3/2/11 @ 12:52 pm

      Can you please post where you found this solution??

      I have the same issue.

      Thanks.

    JaxJason
    3/31/11 @ 7:58 am

    This isn’t too bad with formulas, really. Once you set it up, you can use it for any set of data to remove duplicates based on one or more columns.

    Example:
    In the old table, add a column. In the first row with data, place the number 1. Then in the second, place the following equation (assuming column A has your test values, you have one header row, and column D is the new column you just added):
    =IF(A3<>A2,D2+1,D2)

    This will increment the counter every time a new item in Column A appears.

    Now in your new table in A2 place this:
    =IFERROR(OFFSET(Table1::A1,MATCH(ROW()-1,Table1::D,0)-1,0,1,1),"")

    This will look for that row’s number (minus 1), so row 2 looks for the first item with the number 1 in column D of the other table, moves to that row, and grabs what is in column A. Do this for the other columns, only changing the column base for each column. Fill down for the right number of rows.

    After that you can copy that table, and then do a copy/paste values to remove the formulas.

    If you set it up with enough extra columns, you can use it as a template to drop data into, just modify the test column in the first equation. Then copy the results table back to your other sheet.

    To test for a combination of items being unique, just use concatenation in the test column in the first equation.
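The two-table trick above can be restated in Python to show the mechanics (column names and sample rows are invented): the counter column numbers each run of identical test values, and the second table pulls the first row carrying each counter value.

```python
# Sketch of the two-table approach: a counter column that increments
# whenever the (sorted) test value changes, then a lookup that keeps
# the first row for each counter value.
def build_counter(test_values):
    counters = []
    for i, value in enumerate(test_values):
        if i == 0:
            counters.append(1)                 # first data row gets 1
        elif value != test_values[i - 1]:      # like =IF(A3<>A2,D2+1,D2)
            counters.append(counters[-1] + 1)
        else:
            counters.append(counters[-1])
    return counters

def unique_rows(rows, counters):
    # Like the MATCH/OFFSET lookup: grab the first row for each counter.
    result, seen = [], set()
    for row, c in zip(rows, counters):
        if c not in seen:
            seen.add(c)
            result.append(row)
    return result

rows = [("Alice", "555-0100"), ("Alice", "555-0100"), ("Bob", "555-0101")]
counters = build_counter([name for name, phone in rows])
print(counters)                 # [1, 1, 2]
print(unique_rows(rows, counters))
```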

    Jason

    JaxJason
    3/31/11 @ 7:59 am

    My equation came out wrong; the comment form dropped my <> signs:
    =IF(A3<>A2,D2+1,D2)

    JaxJason
    3/31/11 @ 8:01 am

    My equation came out wrong again, dropped my less than and greater thans
    =IF(A3<>A2,D2+1,D2)

    brian
    6/3/11 @ 6:57 pm

    Excel allows you to do this with just three clicks. Why can’t Numbers do the same???

    Steve
    4/17/12 @ 3:05 pm

    Hmm… I have a list of approx. 50,000 karaoke songs. I am trying to find a way to eliminate duplicate listings. In my database, I have a column labeled Artist and another labeled Title. Is there a way to automate marking duplicate listings referencing two columns?

      4/17/12 @ 3:15 pm

      You could always create a new formula in a third column and have it check the first two columns against the cells above them.
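That check-two-columns idea works like concatenating both values into one test key. Here is a Python sketch with invented song data: treat (artist, title) as a single combined key and mark any row whose key already appeared above it.

```python
# Mark duplicate (artist, title) pairs, keeping the first occurrence.
# Same effect as concatenating two columns into one test value.
def mark_duplicate_pairs(rows):
    seen = set()
    marks = []
    for artist, title in rows:
        key = (artist.strip().lower(), title.strip().lower())
        if key in seen:
            marks.append("*")     # later occurrence: safe to delete
        else:
            seen.add(key)
            marks.append("")
    return marks

songs = [("Queen", "Under Pressure"),
         ("Queen", "Under Pressure"),
         ("Bowie", "Under Pressure")]
print(mark_duplicate_pairs(songs))
# ['', '*', '']  -- same title by a different artist is NOT a duplicate
```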

Comments Closed.