Remove Duplicates from an Unsorted Linked List

Objective: Write a program to remove duplicates from an unsorted linked list.

Example:

Input Linked List : 1->2->2->4->3->3->2
Output : 1->2->4->3

Input: An unsorted linked list

Output: Linked list with no duplicates.

Approach:

  • Create a hash table.
  • Take two pointers, prevNode and currNode.
  • prevNode points to the head of the linked list and currNode points to head.next. Insert the head's data into the hash table before starting, since the traversal begins at the second node.
  • Now navigate through the linked list and check whether each node's data is already present in the hash table.
  • If yes, delete that node by linking prevNode.next to currNode.next.
  • If no, insert that node's data into the hash table and advance prevNode.
  • Return the head of the list.

Time Complexity : O(n)
Space Complexity : O(n)

Follow Up: If an additional buffer is not allowed, we have no option but to check each node's data against every other node's data and, whenever a duplicate is found, delete that node (a sketch follows below).
Time Complexity : O(n^2)
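
A minimal sketch of this no-buffer approach in Java (the Node class and method name here are illustrative assumptions, not code from the original post):

// Assumes a singly linked list node of the form: class Node { int data; Node next; }
public static Node removeDuplicatesNoBuffer(Node head) {
    Node current = head;
    while (current != null) {
        // For each node, scan the remainder of the list and unlink
        // any node carrying the same data.
        Node runner = current;
        while (runner.next != null) {
            if (runner.next.data == current.data) {
                runner.next = runner.next.next; // drop the duplicate
            } else {
                runner = runner.next;
            }
        }
        current = current.next;
    }
    return head;
}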

Complete Code for the Hash Table method:
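
The embedded code from the original post is not shown here, so below is a rough, self-contained sketch of the HashSet-based method described above. The class name, Node definition, helper methods, and print format are assumptions chosen to match the sample output, not necessarily the author's exact code.

import java.util.HashSet;

public class RemoveDuplicates {

    static class Node {
        int data;
        Node next;
        Node(int data) { this.data = data; }
    }

    // Removes duplicates in O(n) time using O(n) extra space (a HashSet).
    public static Node removeDuplicates(Node head) {
        if (head == null) return head;
        HashSet<Integer> seen = new HashSet<>();
        seen.add(head.data);                    // the head is always kept
        Node prevNode = head;
        Node currNode = head.next;
        while (currNode != null) {
            if (seen.contains(currNode.data)) {
                prevNode.next = currNode.next;  // unlink the duplicate node
            } else {
                seen.add(currNode.data);
                prevNode = currNode;
            }
            currNode = currNode.next;
        }
        return head;
    }

    static void print(Node head) {
        for (Node n = head; n != null; n = n.next) {
            System.out.print("->" + n.data);
        }
        System.out.println();
    }

    public static void main(String[] args) {
        int[] values = {1, 2, 2, 3, 4, 4, 2};
        Node head = new Node(values[0]);
        Node tail = head;
        for (int i = 1; i < values.length; i++) {
            tail.next = new Node(values[i]);
            tail = tail.next;
        }
        System.out.print("Original List : ");
        print(head);
        head = removeDuplicates(head);
        System.out.print("Updated List: ");
        print(head);
    }
}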


Output:

Original List : ->1->2->2->3->4->4->2
Updated List: ->1->2->3->4

__________________________________________________

If you find anything incorrect or you feel that there is a better approach to solve the above problem, please write a comment.
__________________________________________________

  • deen john

    Hi, I think there is one problem with the code. Though the result is correct, the removed node would still point to its next node, so it won't be garbage collected. Instead, the removed node's next should be set to null to make it eligible for garbage collection. I have commented the correction in the snippet below:

    while (currNode != null) {
        int data = currNode.data;
        if (ht.contains(data)) {
            prevNode.next = currNode.next;
            // temp = currNode;
            currNode = currNode.next;
            // temp.next = null;

    • Aditi Mangal

      There is another problem with this code. Try running it with this list: 2->2->2->3->4->4->2.
      The output comes out as 2->2->3->4, while it should be 2->3->4.

      • tutorialhorizon

        Hi Aditi, thanks for pointing out the problem; I have corrected the code. Please let me know if you find problems in any other posts.

  • deen john

    Hi, I think there is one problem with the code. Though the result is correct, the removed node would still point to its next node, so it won't be garbage collected. Instead, the removed node's next should be set to null to make it eligible for garbage collection. I have commented the correction in the snippet below:

    while (currNode != null) {
        int data = currNode.data;
        if (ht.contains(data)) {
            prevNode.next = currNode.next;
            // temp = currNode;
            currNode = currNode.next;
            // temp.next = null;

    Please let me know if I am wrong.

    • tutorialhorizon

      Thanks John, somehow I missed your comment. I have implemented your suggestion. Thanks.

  • http://www.bitsa.in/ Aravind

    Why are you using Hash instead of Array? @tutorialhorizon

  • Alex Chebykin

    There actually exists a solution with time complexity O(n + m) and space complexity O(1).
    Hint: suppose the lists are the same length.
    Actual solution:
    NodeList findIntersectionNoMemoryElegant(NodeList other) {
        int thisLen = findLength();
        int otherLen = other.findLength();
        NodeList fst = this;
        NodeList snd = other;
        int diff = thisLen - otherLen;
        while (diff > 0) {
            fst = fst.next;
            diff--;
        }
        while (diff < 0) {
            snd = snd.next;
            diff++;
        }
        while (fst != null && snd != null) {
            if (fst == snd) return fst;
            fst = fst.next;
            snd = snd.next;
        }
        return null;
    }

  • biswajit singh

    Hey, I think we can do it simply using a HashSet. Suppose we have two pointers,
    Node prev = null and Node current = head,
    and we maintain a HashSet, say h1.
    Then we can just use the loop below to remove duplicates:

    while (current != null) {
        if (h1.contains(current.data)) {
            prev.next = current.next;
            current = current.next;
        } else {
            h1.add(current.data);
            prev = current;
            current = current.next;
        }
    }

    I think this is going to work as well. Please correct me if you find any mistake.
