Selenium not working with Docker

I want to run the Selenium Chrome driver in my Django project with a Docker configuration, so I followed this: [https://hub.docker.com/r/selenium/standalone-chrome/][1]

and I have created the Celery task functionality like this:

from selenium.webdriver import Remote
from selenium.webdriver.chrome.options import Options

def setup_driver():
    try:
        chrome_options = Options()
        chrome_options.add_argument("--disable-gpu")
        chrome_options.add_argument("--no-sandbox")
        chrome_options.add_argument("--disable-dev-shm-usage")
        chrome_options.add_argument("--disable-blink-features=AutomationControlled")
        chrome_options.add_argument("--start-maximized")
        chrome_options.add_argument("user-agent=Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/121.0.0.0 Safari/537.36")
        driver = Remote(
            command_executor="http://selenium_grid:4444/wd/hub",  # Use service name!
            options=chrome_options
        )
        print("Successfully connected to Selenium Grid:", driver)
        return driver
    except Exception as e:
        print(f"Failed to connect to Selenium Grid: {e}")
        return None
   
import time

def view_fun(request):
    try:
        driver = setup_driver()
        time.sleep(2)
        logger.info(f"{driver} setup successful")
    except Exception as e:
        print(f"An error occurred: {e}")
    finally:
        driver.quit()
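One thing I noticed: `setup_driver()` returns `None` when the connection fails, so the unconditional `driver.quit()` in the `finally` block itself raises. A guarded teardown (a minimal sketch, not my full view) would avoid that secondary error:

```python
def safe_quit(driver):
    # setup_driver() returns None when the Grid connection fails, so an
    # unconditional driver.quit() raises AttributeError on NoneType.
    # Guard the teardown instead of calling quit() directly in finally.
    if driver is not None:
        driver.quit()
```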

The error I get is:

celery_worker-1  | [2025-01-31 11:50:17,605: WARNING/ForkPoolWorker-4] Retrying (Retry(total=2, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NameResolutionError("<urllib3.connection.HTTPConnection object at 0x7f08d1d0d7f0>: Failed to resolve 'selenium_grid' ([Errno -2] Name or service not known)")': /wd/hub/session
celery_worker-1  | [2025-01-31 11:50:29,547: WARNING/ForkPoolWorker-4] Loading login page...
celery_worker-1  | [2025-01-31 11:50:29,547: WARNING/ForkPoolWorker-4] Login failed: 'NoneType' object has no attribute 'get'
celery_worker-1  | [2025-01-31 11:50:29,548: ERROR/ForkPoolWorker-4] scrape.tasks.run_weekly_scraping_zip[ebf4199d-61d4-4a3c-85ab-c64e228f3e2b]: Error in weekly scrape: 'NoneType' object has no attribute 'quit' 
celery_worker-1  |   File "/app/scrape/utils.py", line 256, in scrape_ziprecruiter_jobs                                             
celery_worker-1  |     driver.quit()                                                                                                
celery_worker-1  | AttributeError: 'NoneType' object has no attribute 'quit'
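The first line of the log is the root cause: the worker container cannot resolve the hostname `selenium_grid`. A quick way to check name resolution from inside the Celery worker (a sketch using only the standard library) is:

```python
import socket

def can_resolve(hostname):
    # In docker-compose, containers resolve each other by *service* name
    # on the shared network; the service below is called "selenium",
    # not "selenium_grid".
    try:
        socket.gethostbyname(hostname)
        return True
    except socket.gaierror:
        return False
```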

Also, in my docker-compose file I have made changes like this:

  web:
    build: .
    command: gunicorn jobscraperapp.wsgi:application --bind 0.0.0.0:8000 --workers 3 --timeout 300 --keep-alive 5
    volumes:
      - .:/app:rw
    env_file:
      - .env
    depends_on:
      - redis
      - selenium
    networks:
      - app_network
    ports:
      - "8000:8000"
    restart: always

  selenium:
    image: selenium/standalone-chrome:132.0-chromedriver-132.0-20250120
    networks:
      - app_network
    ports:
      - "4444:4444"
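Since `depends_on` only orders container startup and does not wait for the Grid to be ready, the worker may also race the Grid at boot. A readiness poll against the Grid's `/status` endpoint (a sketch, assuming the service name `selenium` from the compose file above) could look like:

```python
import json
import time
import urllib.error
import urllib.request

def wait_for_grid(url="http://selenium:4444/wd/hub/status", timeout=60):
    # Poll the Grid's /status endpoint until it reports ready;
    # depends_on only orders startup, it does not wait for readiness.
    deadline = time.time() + timeout
    while time.time() < deadline:
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                status = json.load(resp)
                if status.get("value", {}).get("ready"):
                    return True
        except (urllib.error.URLError, OSError):
            pass
        time.sleep(2)
    return False
```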

I have also checked all the other related questions on Stack Overflow, but this issue is not resolved, so can anyone help me with this?
