Using loop for web links

Hawk81

I want to download data from the web, but the code is too long and error-prone. Is there any way to loop over the web links? The only values that change are the week number and the matchup ids.

Small example from my code:

library(XML)

# import - week 1
data11=readHTMLTable(doc = "http://baseball.fantasysports.yahoo.com/b1/2276/matchup?week=1&mid1=1&mid2=2")
data12=readHTMLTable(doc = "http://baseball.fantasysports.yahoo.com/b1/2276/matchup?week=1&mid1=3&mid2=4")
data13=readHTMLTable(doc = "http://baseball.fantasysports.yahoo.com/b1/2276/matchup?week=1&mid1=5&mid2=6")
data14=readHTMLTable(doc = "http://baseball.fantasysports.yahoo.com/b1/2276/matchup?week=1&mid1=7&mid2=8")
data15=readHTMLTable(doc = "http://baseball.fantasysports.yahoo.com/b1/2276/matchup?week=1&mid1=9&mid2=10")
data16=readHTMLTable(doc = "http://baseball.fantasysports.yahoo.com/b1/2276/matchup?week=1&mid1=11&mid2=12")

data11 <- data11[[4]] 
data12 <- data12[[4]]
data13 <- data13[[4]]
data14 <- data14[[4]]
data15 <- data15[[4]]
data16 <- data16[[4]]

mlb.data1 <- rbind(data11, data12, data13, data14, data15, data16) 

# import - week 2
data11=readHTMLTable(doc = "http://baseball.fantasysports.yahoo.com/b1/2276/matchup?week=2&mid1=1&mid2=2")
data12=readHTMLTable(doc = "http://baseball.fantasysports.yahoo.com/b1/2276/matchup?week=2&mid1=3&mid2=4")
data13=readHTMLTable(doc = "http://baseball.fantasysports.yahoo.com/b1/2276/matchup?week=2&mid1=5&mid2=6")
data14=readHTMLTable(doc = "http://baseball.fantasysports.yahoo.com/b1/2276/matchup?week=2&mid1=7&mid2=8")
data15=readHTMLTable(doc = "http://baseball.fantasysports.yahoo.com/b1/2276/matchup?week=2&mid1=9&mid2=10")
data16=readHTMLTable(doc = "http://baseball.fantasysports.yahoo.com/b1/2276/matchup?week=2&mid1=11&mid2=12")

data11 <- data11[[4]] 
data12 <- data12[[4]]
data13 <- data13[[4]]
data14 <- data14[[4]]
data15 <- data15[[4]]
data16 <- data16[[4]]

mlb.data2 <- rbind(data11, data12, data13, data14, data15, data16) 

# import - week 3
data11=readHTMLTable(doc = "http://baseball.fantasysports.yahoo.com/b1/2276/matchup?week=3&mid1=1&mid2=2")
data12=readHTMLTable(doc = "http://baseball.fantasysports.yahoo.com/b1/2276/matchup?week=3&mid1=3&mid2=4")
data13=readHTMLTable(doc = "http://baseball.fantasysports.yahoo.com/b1/2276/matchup?week=3&mid1=5&mid2=6")
data14=readHTMLTable(doc = "http://baseball.fantasysports.yahoo.com/b1/2276/matchup?week=3&mid1=7&mid2=8")
data15=readHTMLTable(doc = "http://baseball.fantasysports.yahoo.com/b1/2276/matchup?week=3&mid1=9&mid2=10")
data16=readHTMLTable(doc = "http://baseball.fantasysports.yahoo.com/b1/2276/matchup?week=3&mid1=11&mid2=12")

data11 <- data11[[4]] 
data12 <- data12[[4]]
data13 <- data13[[4]]
data14 <- data14[[4]]
data15 <- data15[[4]]
data16 <- data16[[4]]

mlb.data3 <- rbind(data11, data12, data13, data14, data15, data16) 

# add number of week
mlb.data1$week      <- 1
mlb.data2$week      <- 2
mlb.data3$week      <- 3

# complete table
mlb.complet <- rbind(mlb.data1, mlb.data2, mlb.data3)
zx8754

This should work. Note that each link returns a list of tables, so you need to pick out the one you want after the readHTMLTable call.

library(XML)

output <-
  do.call(rbind,
          lapply(1:3, function(week){
            do.call(rbind,
                    lapply(seq(2, 12, 2), function(id){
                      x <- readHTMLTable(doc = paste0(
                        "http://baseball.fantasysports.yahoo.com/b1/2276/matchup?week=",
                        week, "&mid1=", id - 1, "&mid2=", id))
                      # choose which table to keep
                      res <- x$statTable3
                      res$WEEK <- week
                      res$ID <- id
                      res
                    }))
          })
  )
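Before scraping, it can help to build the full vector of URLs up front and eyeball it, so any mistake in the string construction is caught without hitting the server. This is only a sketch of that sanity check, assuming the same league path (b1/2276), weeks 1 to 3, and the even/odd matchup-id pairing from the question; sprintf is vectorized over the columns of the expand.grid result, so no loop is needed.

```r
# All week / matchup combinations in one data frame
grid <- expand.grid(week = 1:3, id = seq(2, 12, 2))

# Build one URL per row; mid1 is always one below mid2 in the question's pattern
urls <- sprintf(
  "http://baseball.fantasysports.yahoo.com/b1/2276/matchup?week=%d&mid1=%d&mid2=%d",
  grid$week, grid$id - 1, grid$id)

head(urls, 2)
# The first URL should match the first one in the question:
# "http://baseball.fantasysports.yahoo.com/b1/2276/matchup?week=1&mid1=1&mid2=2"
```

The same urls vector could then be fed to lapply(urls, readHTMLTable) instead of nesting two loops, which keeps the URL construction and the scraping concerns separate.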
