Internal duplication is rarely bad for SEO. These days Googlebot is very good at detecting duplication and handling it appropriately.
Yes, Googlebot is likely to find and crawl the duplicate URLs eventually. However, when Googlebot finds two URLs on the same site with the same content, it simply picks one to index. The one it chooses will be either the one it found first or the one with higher PageRank. In either case, that is likely to be the URL you link to internally.
Google won't hand out any penalties for internal duplication. The worst that can happen is that Google occasionally indexes a page under a URL you would not prefer. It is also possible that Googlebot will spend bandwidth and crawl budget crawling duplicate sections of your site that won't get indexed.
Other answers correctly tell you how to fix the problem, but I wanted to give a realistic expectation about how "bad" it could be.
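If you do want to steer Google toward a preferred URL, the standard mechanism is a `rel="canonical"` link in the `<head>` of each duplicate page. A minimal sketch (the URLs here are placeholders, not from the question):

```html
<!-- On https://example.com/page?sort=price (the duplicate variant),
     point Google at the preferred version of the page: -->
<head>
  <link rel="canonical" href="https://example.com/page">
</head>
```

Google treats the canonical link as a strong hint rather than a directive, so it may still choose a different URL to index, but in practice it resolves most internal duplication cases.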
See also: What is duplicate content and how can I avoid being penalized for it on my site?