Ppomppu

Developer Forum

A space where IT developers share information and hold discussions.
Asking how to improve the execution speed of a Python crawling for loop
Category: Question
Name: 저팬브랜드


Posted: 2020-01-25 15:02
Views: 2538





I'm scraping some data from a page on our company's intranet,

and it's terribly slow, maybe because I used nested for loops.

 

It fetches about one line per second.

All told, it should come to several thousand lines.

 

I'm a beginner, so I cobbled it together crudely from crawling examples on the internet.

What should I use to make it faster?

 

My initial plan was:

 

 Read URLs 1 through 44
 
 With soup.select, fetch the <a> tags 1 through 50 (using an f-string)
 
 Print the results

 

That's what I was planning to do.
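One observation on the plan above: asking for div:nth-child(1) > a, then div:nth-child(2) > a, and so on queries the same parsed page 49 separate times, when a single pass can collect every link at once. A minimal sketch of the one-pass idea using only the standard library's html.parser (a stand-in for BeautifulSoup here, so the selector syntax differs; the sample HTML is made up):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects the href of every <a> tag in a single pass over the HTML."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.append(dict(attrs).get("href"))

# Made-up sample page with two link-holding divs.
page = '<div><a href="/post/1">one</a></div><div><a href="/post/2">two</a></div>'
collector = LinkCollector()
collector.feed(page)          # the document is parsed exactly once
print(collector.links)        # → ['/post/1', '/post/2']
```

With BeautifulSoup the equivalent shortcut is one selector that matches all the divs at once (e.g. dropping the nth-child({i}) index), so the inner index loop disappears.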

 

import requests

from bs4 import BeautifulSoup as bs


for i in range(1, 50):          # div index 1..49
    for page in range(1, 44):   # page number 1..43

        url = 'internal_url' + str(page)   # placeholder for the intranet address
        res = requests.get(url)
        soup = bs(res.content, 'html.parser')
        link = soup.select(f'body > div:nth-child(5) > center:nth-child(2) > div:nth-child({i}) > a')
        print(link)
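The biggest cost in the code above is loop order: the download sits inside the outer i loop, so each of the 43 pages is re-downloaded and re-parsed 49 times, for 49 × 43 = 2,107 HTTP requests where 43 would do. A minimal sketch with a fake fetch() standing in for requests.get (no network, hypothetical names) that just counts how many requests each structure makes:

```python
fetch_count = 0

def fetch(url):
    """Stand-in for requests.get: only counts how often it is called."""
    global fetch_count
    fetch_count += 1
    return f"<html for {url}>"

# Original structure: the page fetch sits inside the div-index loop.
fetch_count = 0
for i in range(1, 50):
    for page in range(1, 44):
        html = fetch('internal_url' + str(page))
original_requests = fetch_count        # 49 * 43 fetches

# Restructured: fetch each page once, then loop over div indices.
fetch_count = 0
for page in range(1, 44):
    html = fetch('internal_url' + str(page))
    for i in range(1, 50):
        pass  # soup.select(f'... div:nth-child({i}) > a') would run here
restructured_requests = fetch_count    # 43 fetches

print(original_requests, restructured_requests)  # → 2107 43
```

Applied to the real code, this means moving requests.get and the BeautifulSoup parse into the page loop only and running the select on the already-parsed soup; reusing a single requests.Session() object also avoids a fresh TCP handshake per request.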