Presentations at WCCS24

Masatoshi Funabashi and Takahiro Sasaki presented the following papers at the 5th World Conference on Complex Systems (WCCS24), held on November 11-14, 2024, in Mohammedia, Morocco.

Takahiro Sasaki, Godai Suzuki and Masatoshi Funabashi “Integrated Stock and Flow Dynamics for Comprehensive Carbon Offset Evaluation” IEEE Proceedings of the 2024 World Conference on Complex Systems (WCCS), November 11-14, 2024, Mohammedia, Morocco

Masatoshi Funabashi, Kei Aria Nonaka and Tomoyuki Minami “Scale-Free Correction of Under-/Over-Reported Biases in Global Biotic Interaction Network” IEEE Proceedings of the 2024 World Conference on Complex Systems (WCCS), November 11-14, 2024, Mohammedia, Morocco

The abstract book is available here.

Link to the IEEE Proceedings of the 2024 World Conference on Complex Systems (WCCS)

Presentation at IEEE RO-MAN 2024

The following paper was presented at the 33rd IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), held on August 26-30, 2024, in Pasadena, CA, USA.

Locating the Fruit to Be Harvested and Estimating Cut Positions from RGBD Images Acquired by a Camera Moved Along Fixed Paths Using a Mask-R-CNN Based Method
Zhao, Wentao (Waseda University), Otani, Takuya (Shibaura Institute of Technology), Sugiyama, Soma (Waseda University), Mitani, Kento (Waseda University), Masaya, Koki (Waseda University), Takanishi, Atsuo (Waseda University), Aotake, Shuntaro (Waseda University), Funabashi, Masatoshi (SonyCSL/Kyoto University), Ohya, Jun (Waseda University)
Keywords: Degrees of Autonomy and Teleoperation; Machine Learning and Adaptation
Abstract: Compared to traditional agricultural environments, the high density and diversity of vegetation layouts in Synecoculture farms present significant challenges in locating and harvesting occluded fruits and pedicels (cutting points). To address this challenge, this study proposes a Mask R-CNN-based method for locating fruits (tomatoes, yellow bell peppers, etc.) and estimating the pedicels from RGBD images acquired by a camera moved along fixed paths. After obtaining masks of all fruits and pedicels, this method judges the matching relationship between the located fruit and pedicel according to the 3D distance between the fruit and pedicel. Subsequently, this research determines the least occluded best viewpoint for harvesting based on the visible real areas of located fruits in images acquired under the fixed paths, and harvesting is then completed from this best viewpoint following a straight path. Experimental results show this method effectively identifies occluded targets and their cutting positions in both Gazebo simulation environments and real-world farms. This method can select the least occluded viewpoint for a high harvesting success rate.
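Two steps of the pipeline described in the abstract can be sketched in code: pairing each detected fruit with its nearest pedicel by 3D distance, and selecting the least-occluded viewpoint by visible mask area. This is a minimal illustrative sketch only; the function names, the distance threshold, and the data layout are assumptions, not the authors' implementation, which operates on Mask R-CNN masks from RGBD images.

```python
# Illustrative sketch, not the paper's implementation.
# Assumes fruit/pedicel positions are 3D mask centroids (metres) and that
# per-viewpoint visible fruit areas have already been measured.
from math import dist


def match_fruits_to_pedicels(fruits, pedicels, max_dist=0.05):
    """fruits, pedicels: lists of (x, y, z) centroids in metres.
    Returns {fruit_index: pedicel_index} for the nearest pedicel
    within max_dist (threshold is a hypothetical value)."""
    pairs = {}
    for i, fruit in enumerate(fruits):
        j, d = min(
            ((j, dist(fruit, p)) for j, p in enumerate(pedicels)),
            key=lambda t: t[1],
            default=(None, float("inf")),
        )
        if d <= max_dist:
            pairs[i] = j
    return pairs


def best_viewpoint(visible_areas):
    """visible_areas: {viewpoint_id: visible fruit-mask area}.
    The least-occluded viewpoint is the one maximising visible area."""
    return max(visible_areas, key=visible_areas.get)
```

For example, a fruit at the origin paired with a pedicel 3 cm above it would match, while one 20 cm away would be rejected, and `best_viewpoint({"v1": 100, "v2": 250})` selects `"v2"`.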

Presentations at CCE’23

Masatoshi Funabashi, senior researcher at Sony CSL, presented two research papers at Complex Computational Ecosystems 2023 (CCE'23) in Baku, Azerbaijan; one of them received the Best Non-Student Presentation Award.

■ [Best Non-Student Presentation Award] “Vegee Brain Automata: Ultradiscretization of essential chaos transversal in neural and ecosystem dynamics” (Masatoshi Funabashi)

Get the final version manuscript here.

■ “Modeling ecosystem management based on the integration of image analysis and human subjective evaluation – Case studies with synecological farming” (Shuntaro Aotake, Atsuo Takanishi, Masatoshi Funabashi)

Get the final version manuscript here.

Both papers have been published in Springer Lecture Notes in Computer Science.

Paper Published in the Journal of the Robotics Society of Japan

The following paper was published in the Journal of the Robotics Society of Japan, Vol. 40 (2022), No. 9.

Development of the Agricultural Robot in Synecoculture™ Environment —1st Report, Development of Moving Mechanism on the Farm and Realization of Weeding and Harvesting—
田中 大雅, 政谷 巧樹, 寺江 航汰, 水上 英紀, 村上 将嗣, 吉田 駿也, 青竹 峻太郎, 舩橋 真俊, 大谷 拓也, 高西 淳夫

Opinion Article in BiodiverCities by 2030

An opinion article on Synecoculture and urban augmented ecosystems was published in the book “BiodiverCities by 2030: Transforming Cities with Biodiversity”:

Funabashi, M. Living in a Hotspot of City and Biodiversity. The Case of Synecoculture. pp. 252-253. In: Mejía, M.A., Amaya-Espinel, J.D. (eds.). BiodiverCities by 2030: Transforming Cities with Biodiversity. Bogotá. Instituto de Investigación de Recursos Biológicos Alexander von Humboldt. 2022. 288 pages.

Presentations at the 49th Annual Conference of the IIEEJ

Shuntaro Aotake and Masatoshi Funabashi of Sony CSL contributed to joint research with Waseda University on Synecoculture, which was presented as follows.

The 49th Annual Conference (2021) of the Institute of Image Electronics Engineers of Japan (IIEEJ)
Media Computing Conference 2021
https://www.iieej.org/annualconf/2021nenji-top/

June 26 (Sat)

● 16:00-16:45 Student Session (S7), “Animals,” 3 talks (15 min each), Chair: Tomoaki Moriya (Tokyo Denki University)

【S7-3】Study of a method for detecting dominant vegetation in a field from RGB images using deep learning in Synecoculture environment
〇Kanta SOYA*1, Shuntaro AOTAKE*2/*3, Hiroyuki OGATA*4/*5, Jun OHYA*6, Takuya OHTANI*7, Atsuo TAKANISHI*6, Masatoshi FUNABASHI*3
*1 Dept. of Modern Mechanical Engineering, WASEDA University; *2 Dept. of Advanced Science and Engineering, WASEDA University; *3 Sony Computer Science Laboratories, Inc.; *4 Waseda University, Future Robotics Organization; *5 Dept. of System Design, SEIKEI University; *6 Dept. of Modern Mechanical Engineering, WASEDA University; *7 Waseda Research Institute for Science and Engineering

● 16:55-17:40 Student Session (S8), “Animals,” 3 talks (15 min each), Chair: Naoki Kobayashi (Saitama Medical University)

【S8-2】Study of a Method for Recognizing Field Covering Situation by Applying Semantic Segmentation to RGB Images in Synecoculture Environment
〇Reina YOSHIZAKI1, Shuntaro AOTAKE2,3, Hiroyuki OGATA4,5, Jun OHYA1, Takuya OHTANI6, Atsuo TAKANISHI1, Masatoshi FUNABASHI3
1 Graduate School of Creative Science and Engineering, Waseda University; 2 Graduate School of Advanced Science and Engineering, Waseda University; 3 Sony Computer Science Laboratories, Inc.; 4 Faculty of Science and Technology, Seikei University; 5 Waseda University, Future Robotics Organization; 6 Waseda Research Institute for Science and Engineering