I’m trying to scrape some data, but the div element I need is wrapped by an element with a link. When I run this code:
```python
from selenium.webdriver.common.by import By

pts1 = []
pts2 = []
table = driver.find_elements(By.XPATH, '//article[contains(@class,"game-card-view")]')
for match in table:
    pts1.append(driver.find_element(By.XPATH, './div/div[3]/div[1]/span').text)
    pts2.append(driver.find_element(By.XPATH, './div/div[3]/div[2]/span').text)
```
I get a `NoSuchElementException`. There are no iframes, and the XPath is correct. The problem seems to be that the content I need sits behind another element:
```html
<article class="game-card-view">
  <a href="URL">   <!-- the element covering everything below -->
    <div>
      <div>
      <div>
      <div>
        <div>
          <span>
        <div>
          <span>
```
My guess is that the `<a>` element is the reason I can’t scrape the data I need, since it covers the whole div element. Is there a way to bypass that element?
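To make the structure concrete, here is a small standalone sketch using the standard library’s `ElementTree` (not Selenium) on made-up markup mirroring the hierarchy above — the span texts are invented. It shows that a relative path only matches from the node it starts at, so the intermediate `<a>` changes what the path reaches:

```python
import xml.etree.ElementTree as ET

# Hypothetical markup mirroring the page structure described above;
# the span values "10" and "7" are invented for illustration.
markup = """
<article class="game-card-view">
  <a href="URL">
    <div>
      <div/>
      <div/>
      <div>
        <div><span>10</span></div>
        <div><span>7</span></div>
      </div>
    </div>
  </a>
</article>
"""

root = ET.fromstring(markup)  # root is the <article> element

# Relative to <article>, './div/...' finds nothing, because the
# first child is the <a>, not a <div>.
print(root.find('./div/div[3]/div[1]/span'))         # -> None

# Including the <a> step in the path reaches the span.
print(root.find('./a/div/div[3]/div[1]/span').text)  # -> 10
```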