Diary / the 5 entries before 2020-7-23

2020-7-12

I want to see the indoor CO2 level, continued again

A continuation of "I want to see the indoor CO2 level" and "I want to see the indoor CO2 level (continued)".
Now that the indoor CO2 level shows up on the LaMetric Time, self-indulgence made me want to see how the values trend over time in a graph as well.
The script posted yesterday (room.py) runs via cron every 5 minutes and logs the room temperature, humidity, and CO2 level to room.log.

 */5 * * * * cd /home/pi/lametric/;./room.py >> /home/pi/lametric/room.log 2> /dev/null

The contents of room.log look something like this:

 date,temperature,humidity,co2
 2020-07-11 13:50,27,68,513
 2020-07-11 13:55,27,68,519
 2020-07-11 14:00,27,68,507
 2020-07-11 14:05,27,68,508
 2020-07-11 14:10,27,68,505

I tried turning this into a graph.

 #!/usr/bin/env python3
 import pandas as pd
 import matplotlib.pyplot as plt
 import re
 import pprint
 
 df = pd.read_csv('room.log', index_col='date')
 
 period = int(-1 * (60 / 5) * 24 * 1)
 ltst = df[period:].interpolate()
 data1 = ltst.loc[:, 'temperature']
 data2 = ltst.loc[:, 'humidity']
 data3 = ltst.loc[:, 'co2']
 
 xt = []
 xl = []
 idx = ltst.index.values.tolist()
 for i in idx:
     if '00:00' in i:
         xt.append(i)
         xl.append(re.search(r'(\d\d-\d\d) ', i).group())
     elif '12:00' in i:
         xt.append(i)
         xl.append('')
     else:
         xt.append('')
         xl.append('')
 
 plt.style.use('seaborn-darkgrid')
 
 fig, [ax1, ax2, ax3] = plt.subplots(3, 1, sharex='col')
 fig.set_figwidth(12.8)
 fig.set_figheight(9.6)
 
 ax1.plot(data1, color='indianred')
 ax1.set_ylabel('temperature')
 
 ax2.plot(data2, color='royalblue')
 ax2.set_ylabel('humidity')
 
 ax3.plot(data3, color='seagreen')
 ax3.set_ylabel('co2')
 ax3.set(xticks=xt, xticklabels=xl)
 
 plt.tight_layout()
 plt.savefig('graph.png')

Below is the resulting graph (graph.png). It covers one day; line 9 of the script sets the variable period, and the last number on the right-hand side is the number of days.
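As an aside on that arithmetic: at one reading every 5 minutes there are 288 rows per day, so the negative slice offset for n days can be wrapped in a small helper (a sketch of my own; this function is not in the original script):

```python
# Sketch of the arithmetic behind line 9: one reading every 5 minutes
# means (60 / 5) * 24 = 288 readings per day, and a negative offset
# slices off the newest rows when used as df[period:].
def period_for_days(days, minutes_per_sample=5):
    samples_per_day = (60 // minutes_per_sample) * 24
    return -samples_per_day * days

print(period_for_days(1))  # -288, the value line 9 computes for one day
```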

The sharp jump in CO2 just after 12:00 is presumably from lighting the gas stove for lunch. You can see that airing the room out sent the value plummeting, with temperature and humidity dropping sharply along with it.
Not that it tells you anything much, I admit.

References


2020-7-11

I want to see the indoor CO2 level (continued)

A follow-up to the earlier post "I want to see the indoor CO2 level".
The CO2 sensor "MH-Z19" that I ordered from Banggood.com on 2020/06/20

arrived yesterday (2020/07/10). So it took 20 days from order to arrival (the quoted delivery window was 10 to 30 days).
What arrived, by the way, was an "MH-Z19B".

¡¡Áᮡ¢¤³¤Á¤é¤Î¥Ú¡¼¥¸¤ò»²¹Í¤Ë¡¢¤â¤È¤â¤È¶á½ê¤Î¥¢¥á¥À¥¹¤Î¾ðÊó¤äNature Remo¤ÎÆ⢥»¥ó¥µ¡¼¤ÎÃͤòLametric Time¤Øɽ¼¨¤¹¤ë¤¿¤á¤Ë»È¤Ã¤Æ¤¤¤¿Raspberry Pi 3B¤ËÀܳ¤·¤Æ¤ß¤¿¡£¶â°À½¤ÎÂæ¤Î¾å¤ËÃÖ¤¤¤Æ¤¢¤ë¤¿¤á¡¢¥»¥ó¥µ¡¼¤Î²¼¤Ë¥À¥ó¥Ü¡¼¥ë¤òŽ¤êÉÕ¤±¤Æ¤¢¤ë¡£

mh_z19b.jpg

The sensor readings came through without any trouble, so I tried displaying them on the LaMetric Time.

 #!/usr/local/bin/python3
 import requests
 import json
 import datetime
 import subprocess
 
 url = 'https://api.nature.global/1/devices'
 
 headers =  {
     'Content-Type': 'application/json',
     'Authorization': 'Bearer xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx'
 }
 
 res = requests.get(url, headers=headers)
 data = res.json()
 
 hum = str(data[0]['newest_events']['hu']['val'])
 temp = str(round(data[0]['newest_events']['te']['val'], 1))
 
 mh = subprocess.check_output(['sudo', 'python3', '-m', 'mh_z19']).decode('utf-8')
 mh = json.loads(mh)
 co2 = str(mh['co2'])
 
 print(f"{datetime.datetime.today().strftime('%Y-%m-%d %H:%M')},{temp},{hum},{co2}")
 
 hum = hum + '%'
 temp = temp + '°C'
 co2 = co2 + 'ppm'
 
 disp = {
     'frames': [
         {
             'index' : 0,
             'text'  : temp,
             'icon'  : '12464'
         },
         {
             'index' : 1,
             'text'  : hum,
             'icon'  : '12184'
         },
         {
             'index' : 2,
             'text'  : co2,
             'icon'  : '32936'
         }
     ]
 }
 
 disp = json.dumps(disp)
 
 headers = {
     'X-Access-Token': 'yyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyy',
     'Cache-Control': 'no-cache',
     'Accept': 'application/json'
 }
 
 url = "https://developer.lametric.com/api/V1/dev/widget/update/com.lametric.zzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzz/1"
 
 res = requests.post(url, disp, headers=headers)

The script above runs via cron every 5 minutes to refresh the displayed information.

The display cycles, in order: time -> date -> temperature at the local AMeDAS station -> weather and precipitation -> wind speed -> room temperature -> humidity -> CO2 level (the AMeDAS data is updated by a separate script).

It's nothing but self-satisfaction, of course, but I am thoroughly satisfied.

References


2020-7-5

Scraping scripts for importing the Hanshin and NPB schedules into Google Calendar, 2020 edition


calendar.png

Last month (2020/06/19) the pro baseball season finally opened. As a Hanshin fan, as of 2020/07/05 it already feels like the season may as well be over, but that aside: last year I introduced the scripts below.

¡¡Áᮺ£Ç¯¤ÎÆüÄø¤òGoogle¥«¥ì¥ó¥À¡¼¤Ë¥¤¥ó¥Ý¡¼¥È¤·¤è¤¦¤È¤·¤Æ¤ß¤¿¤Î¤À¤¬¡¢¤³¤ì¤¬ÁÇľ¤ËÆ°¤¤¤Æ¤¯¤ì¤Ê¤¤¡£
¡¡°Ê²¼¤ËÆ°¤«¤¹¤Þ¤Ç¤Ë¤ä¤Ã¤¿¤³¤È¤ò¼¨¤¹¡£

Hanshin Tigers schedule

First, fixes to the script.

  • Handled Orix's team code changing from "bs" to "b".
  • Changed the year to "2020" and the dates to match this year's irregular schedule.
  • Suppressed the "JERAセ・リーグ公式戦" label, which noisily appeared on every game.
 #!/usr/bin/python3
 #coding: utf-8
 
 #scrapingtigers.py
 
 import re
 import datetime
 import urllib.request
 import requests
 import pprint
 from bs4 import BeautifulSoup
 
 data = {}
 
 year = '2020'
 
 team = {
     't':'阪神',
     's':'ヤクルト',
     'd':'中日',
     'h':'ソフトバンク',
     'e':'楽天',
     'f':'日本ハム',
     'l':'西武',
     'db':'DeNA',
     'm':'ロッテ',
     'b':'オリックス',
     'g':'巨人',
     'c':'広島',
 }
 
 head = "Subject, Start Date, Start Time, End Date, End Time, Description, Location"
 print(head)
 
 #month_days = {'03':'31', '04':'30', '05':'31', '06':'30', '07':'31', '08':'31', '09':'30'}
 month_days = {'06':'30', '07':'31', '08':'31', '09':'30', '10':'31', '11':'30'}
 
 for month in month_days.keys():
     data.setdefault(month, {})
     for day in range(int(month_days[month])):
         data[month].setdefault(day + 1, {})
         data[month][day + 1].setdefault('date', year + '/' + month + '/' + ('0' + str(day + 1))[-2:])
 
 for month in month_days.keys():
     html = requests.get("https://m.hanshintigers.jp/game/schedule/" + year + "/" + month + ".html")
     soup = BeautifulSoup(html.text, features="lxml")
     day = 1
     for tag in soup.select('li.box_right.gameinfo'):
         text = re.sub(' +', '', tag.text)
         info = text.split("\n")
         if len(info) > 3:
             if info[1] == '\xa0' or re.match('JERAセ・リーグ公式戦', info[1]):
                 info[1] = ''
             data[month][day].setdefault('gameinfo', info[1])
             data[month][day].setdefault('start', info[2])
             data[month][day].setdefault('stadium', info[3])
             if re.match('オールスターゲーム', info[2]):
                 data[month][day]['gameinfo'] = info[2]
                 data[month][day]['start'] = '18:00'
 
         text = str(tag.div)
         if text:
             m = re.match(r'^.*"nologo">(\w+)<.*$', text, flags=(re.MULTILINE|re.DOTALL))
             if m:
                 gameinfo = m.group(1)
                 data[month][day].setdefault('gameinfo', gameinfo)
             m = re.match(r'^.*"logo_left (\w+)">.*$', text, flags=(re.MULTILINE|re.DOTALL))
             if m:
                 team1 = m.group(1)
                 data[month][day].setdefault('team1', team[team1])
             m = re.match(r'^.*"logo_right (\w+)">.*$', text, flags=(re.MULTILINE|re.DOTALL))
             if m:
                 team2 = m.group(1)
                 data[month][day].setdefault('team2', team[team2])
 
         day += 1
 
 for month in month_days.keys():
     for day in data[month].keys():
         if data[month][day].get('start'):
             m = re.match(r'(\d+):(\d+)', data[month][day]['start'])
             if m:
                 sthr = m.group(1)
                 stmn = m.group(2)
                 start = datetime.datetime(int(year), int(month), int(day), int(sthr), int(stmn), 0)
                 delta = datetime.timedelta(hours=4)
                 end = start + delta
                 sttm = start.strftime("%H:%M:%S")
                 entm = end.strftime("%H:%M:%S")
                 summary = ''
                 if data[month][day]['gameinfo']:
                     summary = data[month][day]['gameinfo'] + " "
                 if not re.match('オールスターゲーム', data[month][day]['gameinfo']):
                     summary += data[month][day]['team1'] + "対" + data[month][day]['team2']
                 #head = "Subject, Start Date, Start Time, End Date, End Time, Description, Location"
                 print(f"{summary}, {data[month][day]['date']}, {sttm}, {data[month][day]['date']}, {entm}, {summary}, {data[month][day]['stadium']}")

With just those fixes you would think it would run (the environment is WSL (Ubuntu 20.04) on Windows 10), but it halts with an error along the lines of:

SSL routines:tls12_check_peer_sigalg:wrong signature type

The cause is that the OpenSSL bundled with Ubuntu 20.04 by default is old. It ships with 1.1.1f; upgrade it to 1.1.1g and the script runs fine. The upgrade apparently can't be done via apt — you have to compile from source (at least, that's what I did) — so please google it and research the procedure properly before attempting it.
After that, just dump the output to a csv file and import it into Google Calendar.

NPB schedule

This one also threw a puzzling error, but fixing the script sufficed; no environment update was needed.

  • Changed the year to "2020" and the dates to match this year's irregular schedule.
  • Added a workaround so that SSL certificate verification no longer raises an error.
 #!/usr/bin/python3
 #coding: utf-8
 
 #scrapingnpb2.py
 
 import sys
 import re
 import datetime
 import pandas as pd
 import ssl
 
 ssl._create_default_https_context = ssl._create_unverified_context
 
 print("Subject, Start Date, Start Time, End Date, End Time, Description, Location")
 
 year = '2020'
 #months = ['03', '04', '05', '06', '07', '08', '09']
 months = ['06', '07', '08', '09', '10', '11']
 
 # 0,     1,              2,                 3,          4,   5
 #(0, '3/29（金）', 'DeNA -  中日', '横\u3000浜  18:30', nan, nan)
 
 for month in months:
     url = "http://npb.jp/games/" + year + "/schedule_" + month + "_detail.html"
     tb = pd.io.html.read_html(url)
     for row in tb[0].itertuples(name=None):
         card = ''
         md = re.sub(r'（.*）', '', row[1])
         ymd = year + '/' + md
         sttm = ''
         entm = ''
         place = ''
         if row[2] == row[2]:
             card = re.sub(' -  ', '対', row[2])
         if row[3] == row[3]:
             place_time = row[3].split('  ')
             if len(place_time) > 1:
                 (sthr, stmn) = place_time[1].split(':')
                 (mon, day) = md.split('/')
                 start = datetime.datetime(int(year), int(mon), int(day), int(sthr), int(stmn), 0)
                 delta = datetime.timedelta(minutes=200)
                 end = start + delta
                 sttm = start.strftime("%H:%M:%S")
                 entm = end.strftime("%H:%M:%S")
                 place = re.sub(r'\s+', '', place_time[0])
             else:
                 sttm = '18:00:00'
                 entm = '21:20:00'
                 place = place_time[0]
 
         if len(sys.argv) > 1:
             m = re.search(sys.argv[1], card)
             if m:
                 print(f"{card}, {ymd}, {sttm}, {ymd}, {entm}, {card}, {place}")
         elif card != '':
             print(f"{card}, {ymd}, {sttm}, {ymd}, {entm}, {card}, {place}")

At first I was plagued by the following error:

urllib.error.URLError: <urlopen error [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1108)>

The difference from last year is that the NPB site had switched to "https".
Per the pandas documentation, though, pandas.read_html (or rather lxml) does not support https:

pandas.read_html(io, match='.+', flavor=None, header=None, index_col=None, skiprows=None, attrs=None, parse_dates=False, thousands=',', encoding=None, decimal='.', converters=None, na_values=None, keep_default_na=True, displayed_only=True)
Read HTML tables into a list of DataFrame objects.
Parameters: io : str, path object or file-like object
A URL, a file-like object, or a raw string containing HTML. Note that lxml only accepts the http, ftp and file url protocols. If you have a URL that starts with 'https' you might try removing the 's'.

So the script accessing the site over "http" should in itself be correct, and yet it somehow fails with an SSL verification error anyway (perhaps because the site redirects http requests to https?).
With no better option, I ended up making it ignore the SSL verification error.
After that, once again dump the output to a csv file and import it into Google Calendar.
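For what it's worth, another way around this — rather than disabling certificate verification globally — might be to fetch the page separately and hand the HTML to read_html, since it also accepts a raw HTML string or file-like object. A sketch under that assumption (the tiny table below is a made-up stand-in, not NPB's actual markup; for the real page you would do `html = requests.get(url).text` first):

```python
from io import StringIO
import pandas as pd

# read_html accepts file-like objects, so the page could be fetched with
# requests (which handles https fine) and parsed afterwards. Demonstrated
# here with a minimal stand-in table instead of a live fetch.
html = """
<table>
  <tr><th>date</th><th>card</th><th>place_time</th></tr>
  <tr><td>6/19（金）</td><td>巨人 -  阪神</td><td>東京ドーム  18:00</td></tr>
</table>
"""
tables = pd.read_html(StringIO(html))
print(tables[0].iloc[0, 1])
```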


2020-6-29

Changing the source of train delay information


Every morning I have Google Home read out the day's schedule, the weather forecast, and so on.
I also have it announce delay information for the railway lines I commute on, but for the past few days (since 2020/06/25) it had mysteriously stopped doing so.
The cause: the "train delay information JSON" API below had gone down.

There is no announcement on the provider's site, but the service was run out of goodwill in the first place, and it did state that it "may be interrupted without notice" — so, fair enough.
Thank you for everything up to now.

It might well come back if I simply waited a while, but this seemed like a good opportunity, so I decided to switch to fetching delay information from the Atom 1.0 feed of the "運行情報サイト更新状況" (service-status update) page on Tetsudo.com (鉄道コム), the upstream data source of the API that went down.

The relevant GAS (Google Apps Script) function is shown below.

Before

 function trainDelay() {
   var url = "https://tetsudo.rti-giken.jp/free/delay.json";
   var response = UrlFetchApp.fetch(url); 
   var json = JSON.parse(response.getContentText()); 
 
   var text = "";
   for (var i in json) {
     if (json[i].name === "水郡線" || json[i].name === "七尾線" || json[i].name === "瀬戸大橋線") {
       text += json[i].name + "、";
     }
   }
   
   if (!text) {
     text = "鉄道の運行遅延情報はありません。";
   } else {
     text += "が遅延しています。";
   }
 
   return text;
 }

After

 function trainDelay() {
   var url = 'http://api.tetsudo.com/traffic/atom.xml';
   var response = UrlFetchApp.fetch(url);
   var xml = XmlService.parse(response.getContentText());
   var ns = XmlService.getNamespace('', 'http://www.w3.org/2005/Atom');
   var items = xml.getRootElement().getChildren('entry', ns);
 
   var check = ['水郡線', '七尾線', '瀬戸大橋線'];
 
   var text = "";
   for(var i in items) {
     title = items[i].getChild('title', ns).getText();
     for (var j in check) {
       reg = new RegExp(check[j]);
       if (title.match(reg)) {
         text += check[j] + "、";
       }
     }
   }
 
   if (!text) {
     text = "鉄道の運行遅延情報はありません。";
   } else {
     text += "が遅延しています。";
   }
 
   return text;
 }

What stumped me was that for some reason the XML would not parse at first. From the notes on a page found via Google, I learned that you have to specify the namespace explicitly for every element lookup. What a pain.
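The same trap exists in Python's standard library, for what it's worth. A sketch with xml.etree.ElementTree (the feed snippet is a minimal stand-in of my own, not actual Tetsudo.com output): every element in an Atom feed actually carries the namespace in its name, so unqualified lookups find nothing.

```python
import xml.etree.ElementTree as ET

# Minimal stand-in Atom feed; the real one comes from
# http://api.tetsudo.com/traffic/atom.xml.
atom = """<feed xmlns="http://www.w3.org/2005/Atom">
  <entry><title>水郡線 運行情報</title></entry>
  <entry><title>山手線 運行情報</title></entry>
</feed>"""

root = ET.fromstring(atom)
# Unqualified names find nothing: each element is really named
# '{http://www.w3.org/2005/Atom}entry'.
assert root.findall('entry') == []

ns = {'atom': 'http://www.w3.org/2005/Atom'}
titles = [e.find('atom:title', ns).text for e in root.findall('atom:entry', ns)]

check = ['水郡線', '七尾線', '瀬戸大橋線']
delayed = [line for line in check if any(line in t for t in titles)]
print(delayed)  # ['水郡線']
```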


2020-6-20

I want to see the indoor CO2 level


A warning

The article below is of almost no technical use. It is a tale of failure, full of dimwitted wasted effort brought on by sheer careless stupidity.

LaMetric Time

I've been enjoying displaying all sorts of information on the LaMetric Time I bought a few years ago.

Honestly, its practical value is debatable, but hey, it livens the place up.

It displays, in order:

  • time
  • date and day of week
  • data from the local AMeDAS weather station
    • temperature
    • weather (an icon switching between sunny/cloudy/rainy) and precipitation
    • wind speed
  • readings from the Nature Remo's built-in sensors
    • room temperature
    • humidity

Lately I've had more occasion to stay shut in at home, and I've grown even more conscious of the indoor environment than before. That reminded me of the Withings WS-50 bathroom scale I bought years ago —

it has a built-in CO2 sensor. The site where you can browse data from your Withings devices shows a CO2 graph like the one below.

I want this CO2 reading on the LaMetric Time!

Once indoor CO2 exceeds 1000 ppm, people are said to grow drowsy and their work efficiency to drop sharply.
The graph suggests the CO2 level is measured once every 30 minutes, so the plan: show it on the LaMetric Time at all times, and render it in red above 1000 ppm to prompt ventilation. That is the mechanism I wanted to build.
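As a sketch of that alert idea (my own, never built at this point in the story; the alert icon number is a placeholder, while '32936' is the CO2 icon my display script uses), the LaMetric frame could be switched on the 1000 ppm threshold:

```python
# Sketch of the planned threshold alert. The icon numbers here are not
# verified LaMetric icon IDs: '32936' is the one my script uses for CO2,
# and ALERT_ICON is a pure placeholder for some red warning icon.
NORMAL_ICON = '32936'
ALERT_ICON = '555'  # placeholder

def co2_frame(co2_ppm, threshold=1000):
    """Build one LaMetric frame dict, switching the icon above the threshold."""
    icon = ALERT_ICON if co2_ppm > threshold else NORMAL_ICON
    return {'index': 2, 'text': f'{co2_ppm}ppm', 'icon': icon}

print(co2_frame(1200))
```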

Withings API (OAuth 2.0)

Googling around, I found that Withings provides an API for retrieving data from its devices.

How to use it is explained in detail on the following site.

Following the steps on that page, I succeeded in pulling data from the WS-50.
But while it certainly returns things like body weight, I could not work out how to get the CO2 level.
Only at this point did it finally dawn on me: CO2 is simply not among the retrievable data! Look carefully at the table of retrievable measurement types on this page of the Withings API documentation, and there is no CO2 entry.
Incidentally — moot for me, since CO2 can't be retrieved anyway — using this API requires an access token that expires after 30 minutes and must be reissued every time. That alone makes it surprisingly tedious and rather hard to use.
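For the record, a 30-minute expiry is normal OAuth 2.0 behavior, and the intended remedy is a refresh-token request rather than redoing the authorization by hand. A minimal sketch of the standard grant (the endpoint URL below is a placeholder, and the exact fields Withings expects are assumptions — check its developer documentation):

```python
# Standard OAuth 2.0 refresh-token request, sketched. The endpoint is a
# placeholder; whether Withings needs additional fields is an assumption.
TOKEN_URL = 'https://example.invalid/oauth2/token'  # placeholder, not the real endpoint

def build_refresh_payload(client_id, client_secret, refresh_token):
    return {
        'grant_type': 'refresh_token',
        'client_id': client_id,
        'client_secret': client_secret,
        'refresh_token': refresh_token,
    }

# A cron job could POST this every ~25 minutes and cache the new tokens:
#   requests.post(TOKEN_URL, data=build_refresh_payload(cid, secret, rtok))
print(build_refresh_payload('cid', 'secret', 'rtok')['grant_type'])
```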

Withings WS-50 Scale Syncer - Temperature & CO2

Unable to give up, I kept on googling and found a page that seemed to fit my use case exactly.

That page repurposes, as-is, a tool called Withings WS-50 Scale Syncer - Temperature & CO2, which uses an unofficial Withings API to read the WS-50's temperature and CO2 sensor values and feed them into Domoticz, an open-source home automation system. Since all I need is the WS-50 data itself, I decided to rewrite that tool.

 #!/usr/bin/env python3
 # -*- coding: utf-8 -*-
 
 from datetime import datetime
 import sys
 import time
 import hashlib
 import requests
 
 TMPID = 12
 CO2ID = 35
 
 NOW = int(time.time())
 PDAY = NOW - (60 * 60 * 24)
 
 HEADER = {'user-agent': 'Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2228.0 Safari/537.36'}
 
 URL_BASE = "https://scalews.withings.net/cgi-bin"
 URL_AUTH = URL_BASE + "/auth?action=login&appliver=3000201&apppfm=android&appname=wiscaleNG&callctx=foreground"
 URL_ASSO = URL_BASE + "/association?action=getbyaccountid&enrich=t&appliver=3000201&apppfm=android&appname=wiscaleNG&callctx=foreground&sessionid="
 URL_USAGE = "https://goo.gl/z6NNlH"
 
 
 def authenticate_withings(username, password):
     global pem
     try:
         import certifi
         pem = certifi.old_where()
     except Exception:
         pem = True
     requests.head(URL_USAGE, timeout=3, headers=HEADER, allow_redirects=True, verify=pem)
     payload = {'email': username, 'hash': hashlib.md5(password.encode('utf-8')).hexdigest(), 'duration': '900'}
     response = requests.post(URL_AUTH, data=payload)
     iddata = response.json()
     sessionkey = iddata['body']['sessionid']
     response = requests.get(URL_ASSO + sessionkey)
     iddata = response.json()
     deviceid = iddata['body']['associations'][0]['deviceid']
     return deviceid, sessionkey
 
 
 def download_data(deviceid, sessionkey, mtype, lastdate):
     payload = '/v2/measure?action=getmeashf&deviceid=' + str(deviceid) + '&meastype=' + str(mtype) + '&startdate=' + str(lastdate) + '&enddate=' + str(NOW) + \
         '&appliver=3000201&apppfm=android&appname=wiscaleNG&callctx=foreground&sessionid=' + str(sessionkey)
     try:
         response = requests.get(URL_BASE + payload)
     except Exception:
         sys.exit("[-] Data download failed, exiting" + "\n")
     dataset = response.json()
     return dataset
 
 
 def main():
     username = 'mail@address'
     password = 'password'
 
     deviceid, sessionkey = authenticate_withings(username, password)
     co2data = download_data(deviceid, sessionkey, CO2ID, PDAY)
 
     for item in sorted(co2data['body']['series'][0]['data'], key=lambda x:x['date'], reverse=True):
         dt = datetime.fromtimestamp(item['date'])
         print(f"date:{dt}, co2:{item['value']}")
 
     return
 
 if __name__ == "__main__":
     main()

For debugging, the script above prints the measurement time and the CO2 level at that time, one pair per line, a day's worth, newest first.

date:2020-06-20 13:30:13, co2:509
date:2020-06-20 13:00:12, co2:519
date:2020-06-20 12:30:13, co2:506
date:2020-06-20 12:00:12, co2:497
date:2020-06-20 11:30:13, co2:509
...

At first I was printing these measurement times as the raw unixtime that came back in the data, which is why it took me so long to notice: however often I ran the script, the newest data it returned was from many hours before the run.
Why on earth would that be?
Only at this point did it finally dawn on me that the aforementioned page, "Saving CO2 levels to CloudWatch metrics using a Withings WS-50", states it plainly, right in its opening section:

However, since the WS-50 transmits its sensor data in only the following two patterns, the CO2 level cannot be obtained in real time:
・Environmental sensor data (CO2 level, temperature) is sent once a day, at some arbitrary time
・Environmental sensor data is also sent whenever a weight measurement is taken
Note that CO2 readings themselves are recorded every 30 minutes.

In other words, the CO2 measurement itself does happen every 30 minutes, but it is uploaded to the web only when someone weighs themselves or at one arbitrary moment per day. However often you poll, the retrievable data refreshes just once a day. What my script showed as the "latest" reading was whatever was current at the previous upload — hence many hours before the script ran.
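Converting the timestamps up front would have exposed this instantly. A small sketch of the freshness check I should have had (the names and the 35-minute threshold are mine):

```python
# With one reading every 30 minutes, a genuinely live feed should never be
# much more than ~35 minutes behind. Printing raw unixtime hid exactly this.
def hours_stale(latest_unixtime, now_unixtime):
    return (now_unixtime - latest_unixtime) / 3600

def is_fresh(latest_unixtime, now_unixtime, max_age_min=35):
    return (now_unixtime - latest_unixtime) <= max_age_min * 60

print(hours_stale(0, 7200))  # a reading two hours old -> 2.0
```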

The upshot: the WS-50's built-in CO2 sensor cannot be used for the purpose I had in mind.

MH-Z19

So what did I, still unwilling to give up, do? I went and ordered the following CO2 sensor, the MH-Z19, from Banggood.com.

Why Banggood rather than Amazon? Because Banggood was cheaper and quoted a shorter delivery time. Even so, delivery is 10 to 30 days out; the price was 2,529 yen including shipping.
I just hope the enthusiasm hasn't evaporated by the time the sensor arrives.



The previous 5 entries