How to find all the pages of a website

I want to create a service that takes websites as input from an array (inputData), reading each website from its _website key. It should scan every website (no matter what kind of site it is, the whole site, not just the home page), merge the data from each scanned page into an array, and, when all the websites have been scanned, print the result to the console.

Input :
const inputData = [
    {_website:['https://example1.com/']}, 
    {_website:['https://example2.com']}
];


For example, the output should look like this:
Output : [
  {
      _website:['https://example1.com/'],
      _link:['https://example1.com/about'],
      _statusCode:[200],
  } ,
  {
      _website:[],
      _link:['https://example1.com/blog'],
      _statusCode:[200],
  },
  {
      _website:[],
      _link:['https://example1.com/shop'],
      _statusCode:[200],
  },
//...
  {
      _website:['https://example2.com/'],
      _link:['https://example2.com/about'],
      _statusCode:[200],
  } ,
  {
      _website:[],
      _link:['https://example2.com/brand'],
      _statusCode:[200],
  } ,
  {
      _website:[],
      _link:['https://example2.com/blog'],
      _statusCode:[200],
  } ,
//...
]
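One way to do this is a breadth-first crawl per site: start from the _website URL, fetch the page, record its status code, extract same-origin links, and repeat until there are no unvisited links left. Below is a minimal sketch in Node.js. The names here (`fetchPage`, `crawlSite`, `crawlAll`) are hypothetical; `fetchPage(url)` is assumed to resolve to `{ statusCode, html }` and is injected so it can be backed by `fetch` in production or by a mock in tests:

```javascript
// Extract absolute, same-origin links from an HTML string.
// A regex is a rough substitute for a real HTML parser (e.g. cheerio).
function extractLinks(html, baseUrl) {
  const origin = new URL(baseUrl).origin;
  const links = new Set();
  const re = /href\s*=\s*["']([^"']+)["']/gi;
  let m;
  while ((m = re.exec(html)) !== null) {
    try {
      const url = new URL(m[1], baseUrl); // resolves relative hrefs
      if (url.origin === origin) {
        links.add(url.origin + url.pathname); // drop query/hash for dedup
      }
    } catch {
      // ignore malformed hrefs (mailto:, javascript:, etc. fail the origin check)
    }
  }
  return [...links];
}

// Breadth-first crawl of one site; only the first record carries _website,
// matching the expected output shape above.
async function crawlSite(website, fetchPage) {
  const results = [];
  const seen = new Set();
  const queue = [website];
  let first = true;
  while (queue.length > 0) {
    const link = queue.shift();
    if (seen.has(link)) continue;
    seen.add(link);
    const { statusCode, html } = await fetchPage(link);
    results.push({
      _website: first ? [website] : [],
      _link: [link],
      _statusCode: [statusCode],
    });
    first = false;
    for (const next of extractLinks(html, website)) {
      if (!seen.has(next)) queue.push(next);
    }
  }
  return results;
}

// Crawl every entry in inputData and merge the records into one array.
async function crawlAll(inputData, fetchPage) {
  const out = [];
  for (const entry of inputData) {
    out.push(...(await crawlSite(entry._website[0], fetchPage)));
  }
  return out;
}
```

A production version would also need a page limit, politeness delays, and robots.txt handling; if you go with Go, colly's Visit/OnHTML callbacks cover most of that loop for you.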

Could GitHub - gocolly/colly (Elegant Scraper and Crawler Framework for Golang) be used for this?
