Ppomppu Forum
A space where IT developers share information and hold discussions.

Asking how to improve the execution speed of a Python crawling for loop.

I'm scraping some data from our company homepage on the internal network,
and it's terribly slow, maybe because I'm using nested for loops.

It fetches one line per second,
and all told it should come to several thousand lines.

I'm a beginner, so I put this together crudely from crawling examples on the internet.
What should I use to make it faster?
 
My initial plan was:
 
 Read URLs 1 through 44
 Use soup.select to fetch the a tags 1 through 50 (using an f-string)
 Print the results
 
That's what I'm trying to do.
 
import requests
from bs4 import BeautifulSoup as bs

for i in range(1, 51):                # items 1..50
    for page in range(1, 45):         # pages 1..44
        # NOTE: each page ends up being downloaded once per item, i.e. 50 times
        url = 'internal_address' + str(page)
        res = requests.get(url)
        soup = bs(res.content, 'html.parser')
        link = soup.select(f'body > div:nth-child(5) > center:nth-child(2) > div:nth-child({i}) > a')
        print(link)
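The loop nesting above is the main cost: with the item loop on the outside, each of the 44 pages is downloaded 50 times, for roughly 2,200 HTTP requests instead of 44. Fetching each page once and selecting all of its links with a single CSS query removes almost all of that. A minimal sketch under the same assumptions as the question (the `internal_address` base URL is the question's placeholder, and `parse_links`/`crawl` are illustrative names):

```python
import requests
from bs4 import BeautifulSoup as bs

def parse_links(html):
    """Select every <a> in the target <div> column with one CSS query."""
    soup = bs(html, 'html.parser')
    # One selector matches all child divs at once, replacing 50 nth-child(i) queries.
    return soup.select('body > div:nth-child(5) > center:nth-child(2) > div > a')

def crawl():
    # A Session reuses the TCP connection (keep-alive), trimming per-request latency.
    with requests.Session() as session:
        for page in range(1, 45):                 # pages 1..44
            url = 'internal_address' + str(page)  # placeholder, as in the question
            res = session.get(url)
            for link in parse_links(res.content):
                print(link)
```

If the server tolerates it, the 44 fetches can also be run in parallel with `concurrent.futures.ThreadPoolExecutor`, but the one-fetch-per-page change is where most of the speedup comes from.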
