Quick Draw Doodle Recognition: how to make friends with R, C++ and neural networks


Hi, Habr!

Last autumn, Kaggle hosted the Quick Draw Doodle Recognition competition on classifying hand-drawn images, in which, among others, a team of R users took part: Artem Klevtsov, Philipp Upravitelev and Andrey Ogurtsov. We will not describe the competition in detail; that has already been done in a recent publication.

This time we did not manage to win a medal, but we gained a lot of useful experience, so I would like to tell the community about a number of things that are interesting and useful both on Kaggle and in everyday work. Among the topics discussed: the hard life without OpenCV, JSON parsing (with examples examining the integration of C++ code into R scripts or packages using Rcpp), parametrization of scripts, and dockerization of the final solution. All the code from this post, in a form suitable for running, is available in the repository.

Contents:

  1. Efficiently loading data from CSV into MonetDB
  2. Preparing batches
  3. An iterator for extracting batches from the database
  4. Choosing the model architecture
  5. Parametrizing scripts
  6. Dockerizing scripts
  7. Using multiple GPUs on Google Cloud
  8. Instead of a conclusion

1. Efficiently loading data from CSV into a MonetDB database

The data in this competition is supplied not as ready-made images but as 340 CSV files (one file per class) containing JSONs with point coordinates. Connecting these points with lines produces the final 256x256 pixel image. Each record also carries a label indicating whether the picture was correctly recognized by the classifier in use when the dataset was collected, a two-letter code of the country of residence of the picture's author, a unique identifier, a timestamp, and a class name matching the file name. The simplified version of the raw data weighs 7.4 GB archived and about 20 GB unpacked; the full data takes 240 GB after unpacking. The organizers guaranteed that both versions reproduce the same drawings, which makes the full version redundant. Either way, storing 50 million images in graphic files or as arrays was deemed wasteful, so we decided to merge all CSV files from the train_simplified.zip archive into a database, generating images of the required size "on the fly" for each batch.

As the DBMS we chose the well-proven MonetDB, namely its implementation for R, the MonetDBLite package. The package includes an embedded version of the database server and lets you start the server straight from an R session and work with it there. Creating a database and connecting to it take a single command:

con <- DBI::dbConnect(drv = MonetDBLite::MonetDBLite(), Sys.getenv("DBDIR"))

We need to create two tables: one for all the data and one for service information about processed files (handy if something goes wrong and the process has to be resumed after several files have already been loaded):

Creating the tables

if (!DBI::dbExistsTable(con, "doodles")) {
  DBI::dbCreateTable(
    con = con,
    name = "doodles",
    fields = c(
      "countrycode" = "char(2)",
      "drawing" = "text",
      "key_id" = "bigint",
      "recognized" = "bool",
      "timestamp" = "timestamp",
      "word" = "text"
    )
  )
}

if (!DBI::dbExistsTable(con, "upload_log")) {
  DBI::dbCreateTable(
    con = con,
    name = "upload_log",
    fields = c(
      "id" = "serial",
      "file_name" = "text UNIQUE",
      "uploaded" = "bool DEFAULT false"
    )
  )
}

The fastest way to load the data into the database turned out to be copying the CSV files directly with the SQL command COPY OFFSET 2 INTO tablename FROM path USING DELIMITERS ',','\n','"' NULL AS '' BEST EFFORT, where tablename is the table name and path is the path to the file. While working with the archive we found that the built-in unzip implementation in R does not handle a number of files from the archive correctly, so we used the system unzip (via the getOption("unzip") parameter).

Function for writing to the database

#' @title Extract and load files
#'
#' @description
#' Extracts CSV files from a ZIP archive and loads them into a database
#'
#' @param con Database connection object (class `MonetDBEmbeddedConnection`).
#' @param tablename Name of the table in the database.
#' @param zipfile Path to the ZIP archive.
#' @param filename Name of the file inside the ZIP archive.
#' @param preprocess Preprocessing function applied to the extracted file.
#'   Must take a single argument `data` (a `data.table` object).
#'
#' @return `TRUE`.
#'
upload_file <- function(con, tablename, zipfile, filename, preprocess = NULL) {
  # Check the arguments
  checkmate::assert_class(con, "MonetDBEmbeddedConnection")
  checkmate::assert_string(tablename)
  checkmate::assert_string(filename)
  checkmate::assert_true(DBI::dbExistsTable(con, tablename))
  checkmate::assert_file_exists(zipfile, access = "r", extension = "zip")
  checkmate::assert_function(preprocess, args = c("data"), null.ok = TRUE)

  # Extract the file
  path <- file.path(tempdir(), filename)
  unzip(zipfile, files = filename, exdir = tempdir(), 
        junkpaths = TRUE, unzip = getOption("unzip"))
  on.exit(unlink(path))

  # Apply the preprocessing function
  if (!is.null(preprocess)) {
    .data <- data.table::fread(file = path)
    .data <- preprocess(data = .data)
    data.table::fwrite(x = .data, file = path, append = FALSE)
    rm(.data)
  }

  # SQL query to import the CSV into the database
  sql <- sprintf(
    "COPY OFFSET 2 INTO %s FROM '%s' USING DELIMITERS ',','\\n','\"' NULL AS '' BEST EFFORT",
    tablename, path
  )
  # Execute the query
  DBI::dbExecute(con, sql)

  # Record the successful upload in the service table
  DBI::dbExecute(con, sprintf("INSERT INTO upload_log(file_name, uploaded) VALUES('%s', true)",
                              filename))

  return(invisible(TRUE))
}

If the table needs to be transformed before being written to the database, it is enough to pass a function via the preprocess argument that transforms the data.

Code for sequentially loading the data into the database:

Writing data to the database

# List of files to write
files <- unzip(zipfile, list = TRUE)$Name

# List of exclusions in case some files have already been uploaded
to_skip <- DBI::dbGetQuery(con, "SELECT file_name FROM upload_log")[[1L]]
files <- setdiff(files, to_skip)

if (length(files) > 0L) {
  # Start the timer
  tictoc::tic()
  # Progress bar
  pb <- txtProgressBar(min = 0L, max = length(files), style = 3)
  for (i in seq_along(files)) {
    upload_file(con = con, tablename = "doodles", 
                zipfile = zipfile, filename = files[i])
    setTxtProgressBar(pb, i)
  }
  close(pb)
  # Stop the timer
  tictoc::toc()
}

# 526.141 sec elapsed - SSD -> SSD copy
# 558.879 sec elapsed - USB -> SSD copy

The data loading time may vary depending on the speed characteristics of the drive used. In our case, reading and writing within one SSD, or from a flash drive (the source file) to an SSD (the DB), takes less than 10 minutes.

It takes a few more seconds to create a column with integer class labels and an index column (ORDERED INDEX) with the row numbers by which observations will be sampled when forming batches:

Creating an additional column and an index

message("Generate labels")
invisible(DBI::dbExecute(con, "ALTER TABLE doodles ADD label_int int"))
invisible(DBI::dbExecute(con, "UPDATE doodles SET label_int = dense_rank() OVER (ORDER BY word) - 1"))

message("Generate row numbers")
invisible(DBI::dbExecute(con, "ALTER TABLE doodles ADD id serial"))
invisible(DBI::dbExecute(con, "CREATE ORDERED INDEX doodles_id_ord_idx ON doodles(id)"))

To solve the problem of forming batches on the fly, we needed to achieve the maximum possible speed of extracting random rows from the doodles table. For this we used 3 tricks. The first was to reduce the dimensionality of the type storing the observation ID. In the original dataset the type required to store the ID is bigint, but the number of observations makes it possible to fit their identifiers, equal to the ordinal number, into the int type. Lookups are much faster in this case. The second trick was to use ORDERED INDEX — we arrived at this decision empirically, having gone through all the available options. The third was to use parametrized queries. The essence of the method is to execute the PREPARE command once and then reuse the prepared expression when issuing a pile of queries of the same type; in fairness, the advantage compared to a plain SELECT turned out to be within the range of statistical error.

The upload process consumes no more than 450 MB of RAM. In other words, the described approach lets you shift datasets weighing tens of gigabytes on almost any budget hardware, including some single-board devices, which is rather nice.

All that remains is to measure the speed of (random) data retrieval and evaluate how it scales when sampling batches of different sizes:

Database benchmark

library(ggplot2)

set.seed(0)
# Connect to the database
con <- DBI::dbConnect(MonetDBLite::MonetDBLite(), Sys.getenv("DBDIR"))

# Number of rows in the table (needed by fetch_data below)
n <- DBI::dbGetQuery(con, "SELECT count(*) FROM doodles")[[1L]]

# Function preparing the query on the server side
prep_sql <- function(batch_size) {
  sql <- sprintf("PREPARE SELECT id FROM doodles WHERE id IN (%s)",
                 paste(rep("?", batch_size), collapse = ","))
  res <- DBI::dbSendQuery(con, sql)
  return(res)
}

# Function fetching the data
fetch_data <- function(rs, batch_size) {
  ids <- sample(seq_len(n), batch_size)
  res <- DBI::dbFetch(DBI::dbBind(rs, as.list(ids)))
  return(res)
}

# Run the measurement
res_bench <- bench::press(
  batch_size = 2^(4:10),
  {
    rs <- prep_sql(batch_size)
    bench::mark(
      fetch_data(rs, batch_size),
      min_iterations = 50L
    )
  }
)
# Benchmark parameters
cols <- c("batch_size", "min", "median", "max", "itr/sec", "total_time", "n_itr")
res_bench[, cols]

#   batch_size      min   median      max `itr/sec` total_time n_itr
#        <dbl> <bch:tm> <bch:tm> <bch:tm>     <dbl>   <bch:tm> <int>
# 1         16   23.6ms  54.02ms  93.43ms     18.8        2.6s    49
# 2         32     38ms  84.83ms 151.55ms     11.4       4.29s    49
# 3         64   63.3ms 175.54ms 248.94ms     5.85       8.54s    50
# 4        128   83.2ms 341.52ms 496.24ms     3.00      16.69s    50
# 5        256  232.8ms 653.21ms 847.44ms     1.58      31.66s    50
# 6        512  784.6ms    1.41s    1.98s     0.740       1.1m    49
# 7       1024  681.7ms    2.72s    4.06s     0.377      2.16m    49

ggplot(res_bench, aes(x = factor(batch_size), y = median, group = 1)) +
  geom_point() +
  geom_line() +
  ylab("median time, s") +
  theme_minimal()

DBI::dbDisconnect(con, shutdown = TRUE)

[plot: median fetch time by batch size]

2. Preparing batches

The whole batch preparation process consists of the following steps:

  1. Parsing JSONs containing string vectors with point coordinates.
  2. Drawing colored lines from the point coordinates on an image of the required size (for example, 256x256 or 128x128).
  3. Converting the resulting images into a tensor.

In the competition among Python kernels, the problem was mostly solved with OpenCV. One of the simplest and most obvious analogues in R looks like this:

Implementation of the JSON to tensor conversion in R

r_process_json_str <- function(json, line.width = 3, 
                               color = TRUE, scale = 1) {
  # Parse the JSON
  coords <- jsonlite::fromJSON(json, simplifyMatrix = FALSE)
  tmp <- tempfile()
  # Remove the temporary file when the function exits
  on.exit(unlink(tmp))
  png(filename = tmp, width = 256 * scale, height = 256 * scale, pointsize = 1)
  # Empty plot
  plot.new()
  # Size of the plotting window
  plot.window(xlim = c(256 * scale, 0), ylim = c(256 * scale, 0))
  # Line colors
  cols <- if (color) rainbow(length(coords)) else "#000000"
  for (i in seq_along(coords)) {
    lines(x = coords[[i]][[1]] * scale, y = coords[[i]][[2]] * scale, 
          col = cols[i], lwd = line.width)
  }
  dev.off()
  # Convert the image into a 3-dimensional array
  res <- png::readPNG(tmp)
  return(res)
}

r_process_json_vector <- function(x, ...) {
  res <- lapply(x, r_process_json_str, ...)
  # Combine the 3-dimensional image arrays into a 4-dimensional tensor
  res <- do.call(abind::abind, c(res, along = 0))
  return(res)
}

The drawing is done with standard R tools and saved to a temporary PNG stored in RAM (on Linux, temporary R directories live in /tmp, which is mounted in RAM). This file is then read as a three-dimensional array with numbers ranging from 0 to 1. This matters because a more conventional BMP would be read into a raw array of hex color codes.

Let's test the result:

zip_file <- file.path("data", "train_simplified.zip")
csv_file <- "cat.csv"
unzip(zip_file, files = csv_file, exdir = tempdir(), 
      junkpaths = TRUE, unzip = getOption("unzip"))
tmp_data <- data.table::fread(file.path(tempdir(), csv_file), sep = ",", 
                              select = "drawing", nrows = 10000)
arr <- r_process_json_str(tmp_data[4, drawing])
dim(arr)
# [1] 256 256   3
plot(magick::image_read(arr))

[image: rendered doodle]

The batch itself is formed as follows:

res <- r_process_json_vector(tmp_data[1:4, drawing], scale = 0.5)
str(res)
 # num [1:4, 1:128, 1:128, 1:3] 1 1 1 1 1 1 1 1 1 1 ...
 # - attr(*, "dimnames")=List of 4
 #  ..$ : NULL
 #  ..$ : NULL
 #  ..$ : NULL
 #  ..$ : NULL

This implementation seemed suboptimal to us, since forming large batches takes indecently long, and we decided to draw on our colleagues' experience by using the powerful OpenCV library. At that time there was no ready-made R package for it (there is none now), so a minimal implementation of the required functionality was written in C++ and integrated into the R code using Rcpp.

The following packages and libraries were used to solve the problem:

  1. OpenCV for working with images and drawing lines. We used the pre-installed system libraries and header files, as well as dynamic linking.

  2. xtensor for working with multidimensional arrays and tensors. We used the header files shipped in the R package of the same name. The library lets you work with multidimensional arrays in both row-major and column-major order.

  3. ndjson for parsing JSON. This library is picked up by xtensor automatically if it is present in the project.

  4. RcppThread for organizing multithreaded processing of the vector of JSONs. We used the header files provided by this package. Unlike the more popular RcppParallel, the package, among other things, has a built-in loop interruption mechanism.

It is worth noting that xtensor turned out to be a godsend: besides offering extensive functionality and high performance, its developers proved quite responsive and answered questions promptly and in detail. With their help we managed to implement conversions of OpenCV matrices into xtensor tensors, as well as a way to combine 3-dimensional image tensors into a 4-dimensional tensor of the correct dimensionality (the batch itself).

Materials for learning Rcpp, xtensor and RcppThread:

https://thecoatlessprofessor.com/programming/unofficial-rcpp-api-documentation

https://docs.opencv.org/4.0.1/d7/dbd/group__imgproc.html

https://xtensor.readthedocs.io/en/latest/

https://xtensor.readthedocs.io/en/latest/file_loading.html#loading-json-data-into-xtensor

https://cran.r-project.org/web/packages/RcppThread/vignettes/RcppThread-vignette.pdf

To compile files that use system files and dynamic linking with libraries installed on the system, we used the plugin mechanism built into the Rcpp package. To find paths and flags automatically, we used the popular Linux utility pkg-config.

Implementation of the Rcpp plugin for using the OpenCV library

Rcpp::registerPlugin("opencv", function() {
  # Possible package names
  pkg_config_name <- c("opencv", "opencv4")
  # Binary of the pkg-config utility
  pkg_config_bin <- Sys.which("pkg-config")
  # Check that the utility is available on the system
  checkmate::assert_file_exists(pkg_config_bin, access = "x")
  # Check that an OpenCV config file for pkg-config exists
  check <- sapply(pkg_config_name, 
                  function(pkg) system(paste(pkg_config_bin, pkg)))
  if (all(check != 0)) {
    stop("OpenCV config for the pkg-config not found", call. = FALSE)
  }

  pkg_config_name <- pkg_config_name[check == 0]
  list(env = list(
    PKG_CXXFLAGS = system(paste(pkg_config_bin, "--cflags", pkg_config_name), 
                          intern = TRUE),
    PKG_LIBS = system(paste(pkg_config_bin, "--libs", pkg_config_name), 
                      intern = TRUE)
  ))
})

As a result of the plugin's work, the following values are substituted during compilation:

Rcpp:::.plugins$opencv()$env

# $PKG_CXXFLAGS
# [1] "-I/usr/include/opencv"
#
# $PKG_LIBS
# [1] "-lopencv_shape -lopencv_stitching -lopencv_superres -lopencv_videostab -lopencv_aruco -lopencv_bgsegm -lopencv_bioinspired -lopencv_ccalib -lopencv_datasets -lopencv_dpm -lopencv_face -lopencv_freetype -lopencv_fuzzy -lopencv_hdf -lopencv_line_descriptor -lopencv_optflow -lopencv_video -lopencv_plot -lopencv_reg -lopencv_saliency -lopencv_stereo -lopencv_structured_light -lopencv_phase_unwrapping -lopencv_rgbd -lopencv_viz -lopencv_surface_matching -lopencv_text -lopencv_ximgproc -lopencv_calib3d -lopencv_features2d -lopencv_flann -lopencv_xobjdetect -lopencv_objdetect -lopencv_ml -lopencv_xphoto -lopencv_highgui -lopencv_videoio -lopencv_imgcodecs -lopencv_photo -lopencv_imgproc -lopencv_core"

The implementation code for parsing JSON and generating batches to send to the model is given under the spoiler. First, add a local project directory to search for header files (needed for ndjson):

Sys.setenv("PKG_CXXFLAGS" = paste0("-I", normalizePath(file.path("src"))))

Implementation of the JSON to tensor conversion in C++

// [[Rcpp::plugins(cpp14)]]
// [[Rcpp::plugins(opencv)]]
// [[Rcpp::depends(xtensor)]]
// [[Rcpp::depends(RcppThread)]]

#include <xtensor/xjson.hpp>
#include <xtensor/xadapt.hpp>
#include <xtensor/xview.hpp>
#include <xtensor-r/rtensor.hpp>
#include <opencv2/core/core.hpp>
#include <opencv2/highgui/highgui.hpp>
#include <opencv2/imgproc/imgproc.hpp>
#include <Rcpp.h>
#include <RcppThread.h>

// Type aliases
using RcppThread::parallelFor;
using json = nlohmann::json;
using points = xt::xtensor<double,2>;     // Point coordinates extracted from JSON
using strokes = std::vector<points>;      // Strokes extracted from JSON
using xtensor3d = xt::xtensor<double, 3>; // Tensor holding a single image matrix
using xtensor4d = xt::xtensor<double, 4>; // Tensor holding a set of images
using rtensor3d = xt::rtensor<double, 3>; // Wrapper for exporting to R
using rtensor4d = xt::rtensor<double, 4>; // Wrapper for exporting to R

// Static constants
// Image size in pixels
const static int SIZE = 256;
// Line type
// See https://en.wikipedia.org/wiki/Pixel_connectivity#2-dimensional
const static int LINE_TYPE = cv::LINE_4;
// Line width in pixels
const static int LINE_WIDTH = 3;
// Resize algorithm
// https://docs.opencv.org/3.1.0/da/d54/group__imgproc__transform.html#ga5bb5a1fea74ea38e1a5445ca803ff121
const static int RESIZE_TYPE = cv::INTER_LINEAR;

// Template for converting an OpenCV matrix into a tensor
template <typename T, int NCH, typename XT=xt::xtensor<T,3,xt::layout_type::column_major>>
XT to_xt(const cv::Mat_<cv::Vec<T, NCH>>& src) {
  // Dimensions of the target tensor
  std::vector<int> shape = {src.rows, src.cols, NCH};
  // Total number of elements in the array
  size_t size = src.total() * NCH;
  // Convert cv::Mat to xt::xtensor
  XT res = xt::adapt((T*) src.data, size, xt::no_ownership(), shape);
  return res;
}

// Convert JSON into a list of point coordinates
strokes parse_json(const std::string& x) {
  auto j = json::parse(x);
  // The parsing result must be an array
  if (!j.is_array()) {
    throw std::runtime_error("'x' must be JSON array.");
  }
  strokes res;
  res.reserve(j.size());
  for (const auto& a: j) {
    // Each array element must be a 2-dimensional array
    if (!a.is_array() || a.size() != 2) {
      throw std::runtime_error("'x' must include only 2d arrays.");
    }
    // Extract the vector of points
    auto p = a.get<points>();
    res.push_back(p);
  }
  return res;
}

// Drawing the lines
// Colors are in HSV
cv::Mat ocv_draw_lines(const strokes& x, bool color = true) {
  // Initial matrix type
  auto stype = color ? CV_8UC3 : CV_8UC1;
  // Final matrix type
  auto dtype = color ? CV_32FC3 : CV_32FC1;
  auto bg = color ? cv::Scalar(0, 0, 255) : cv::Scalar(255);
  auto col = color ? cv::Scalar(0, 255, 220) : cv::Scalar(0);
  cv::Mat img = cv::Mat(SIZE, SIZE, stype, bg);
  // Number of strokes
  size_t n = x.size();
  for (const auto& s: x) {
    // Number of points in the stroke
    size_t n_points = s.shape()[1];
    for (size_t i = 0; i < n_points - 1; ++i) {
      // Start point of the segment
      cv::Point from(s(0, i), s(1, i));
      // End point of the segment
      cv::Point to(s(0, i + 1), s(1, i + 1));
      // Draw the line
      cv::line(img, from, to, col, LINE_WIDTH, LINE_TYPE);
    }
    if (color) {
      // Change the line color
      col[0] += 180 / n;
    }
  }
  if (color) {
    // Convert the color representation to RGB
    cv::cvtColor(img, img, cv::COLOR_HSV2RGB);
  }
  // Convert to float32 format with values in [0, 1]
  img.convertTo(img, dtype, 1 / 255.0);
  return img;
}

// Process JSON and obtain a tensor with the image data
xtensor3d process(const std::string& x, double scale = 1.0, bool color = true) {
  auto p = parse_json(x);
  auto img = ocv_draw_lines(p, color);
  if (scale != 1) {
    cv::Mat out;
    cv::resize(img, out, cv::Size(), scale, scale, RESIZE_TYPE);
    cv::swap(img, out);
    out.release();
  }
  xtensor3d arr = color ? to_xt<double,3>(img) : to_xt<double,1>(img);
  return arr;
}

// [[Rcpp::export]]
rtensor3d cpp_process_json_str(const std::string& x, 
                               double scale = 1.0, 
                               bool color = true) {
  xtensor3d res = process(x, scale, color);
  return res;
}

// [[Rcpp::export]]
rtensor4d cpp_process_json_vector(const std::vector<std::string>& x, 
                                  double scale = 1.0, 
                                  bool color = false) {
  size_t n = x.size();
  size_t dim = floor(SIZE * scale);
  size_t channels = color ? 3 : 1;
  xtensor4d res({n, dim, dim, channels});
  parallelFor(0, n, [&x, &res, scale, color](int i) {
    xtensor3d tmp = process(x[i], scale, color);
    auto view = xt::view(res, i, xt::all(), xt::all(), xt::all());
    view = tmp;
  });
  return res;
}

This code should be placed in the file src/cv_xt.cpp and compiled with the command Rcpp::sourceCpp(file = "src/cv_xt.cpp", env = .GlobalEnv); nlohmann/json.hpp from the repository is also required for it to work. The code is split into several functions:

  • to_xt — a templated function for converting an image matrix (cv::Mat) into the tensor xt::xtensor;

  • parse_json — the function parses a JSON string, extracts the point coordinates and packs them into a vector;

  • ocv_draw_lines — draws multicolored lines from the resulting vector of points;

  • process — combines the functions above and adds the ability to scale the resulting image;

  • cpp_process_json_str — a wrapper over the process function that exports the result to an R object (a multidimensional array);

  • cpp_process_json_vector — a wrapper over cpp_process_json_str that allows processing a vector of strings in multithreaded mode.

To draw multicolored lines, the HSV color model was used, followed by conversion to RGB. Let's test the result:

arr <- cpp_process_json_str(tmp_data[4, drawing])
dim(arr)
# [1] 256 256   3
plot(magick::image_read(arr))

[image: rendered doodle (C++ implementation)]
Comparing the speed of the R and C++ implementations

res_bench <- bench::mark(
  r_process_json_str(tmp_data[4, drawing], scale = 0.5),
  cpp_process_json_str(tmp_data[4, drawing], scale = 0.5),
  check = FALSE,
  min_iterations = 100
)
# Benchmark parameters
cols <- c("expression", "min", "median", "max", "itr/sec", "total_time", "n_itr")
res_bench[, cols]

#   expression                min     median       max `itr/sec` total_time  n_itr
#   <chr>                <bch:tm>   <bch:tm>  <bch:tm>     <dbl>   <bch:tm>  <int>
# 1 r_process_json_str     3.49ms     3.55ms    4.47ms      273.      490ms    134
# 2 cpp_process_json_str   1.94ms     2.02ms    5.32ms      489.      497ms    243

library(ggplot2)
# Run the measurement
res_bench <- bench::press(
  batch_size = 2^(4:10),
  {
    .data <- tmp_data[sample(seq_len(.N), batch_size), drawing]
    bench::mark(
      r_process_json_vector(.data, scale = 0.5),
      cpp_process_json_vector(.data,  scale = 0.5),
      min_iterations = 50,
      check = FALSE
    )
  }
)

res_bench[, cols]

#    expression   batch_size      min   median      max `itr/sec` total_time n_itr
#    <chr>             <dbl> <bch:tm> <bch:tm> <bch:tm>     <dbl>   <bch:tm> <int>
#  1 r                   16   50.61ms  53.34ms  54.82ms    19.1     471.13ms     9
#  2 cpp                 16    4.46ms   5.39ms   7.78ms   192.      474.09ms    91
#  3 r                   32   105.7ms 109.74ms 212.26ms     7.69        6.5s    50
#  4 cpp                 32    7.76ms  10.97ms  15.23ms    95.6     522.78ms    50
#  5 r                   64  211.41ms 226.18ms 332.65ms     3.85      12.99s    50
#  6 cpp                 64   25.09ms  27.34ms  32.04ms    36.0        1.39s    50
#  7 r                  128   534.5ms 627.92ms 659.08ms     1.61      31.03s    50
#  8 cpp                128   56.37ms  58.46ms  66.03ms    16.9        2.95s    50
#  9 r                  256     1.15s    1.18s    1.29s     0.851     58.78s    50
# 10 cpp                256  114.97ms 117.39ms 130.09ms     8.45       5.92s    50
# 11 r                  512     2.09s    2.15s    2.32s     0.463       1.8m    50
# 12 cpp                512  230.81ms  235.6ms 261.99ms     4.18      11.97s    50
# 13 r                 1024        4s    4.22s     4.4s     0.238       3.5m    50
# 14 cpp               1024  410.48ms 431.43ms 462.44ms     2.33      21.45s    50

ggplot(res_bench, aes(x = factor(batch_size), y = median, 
                      group =  expression, color = expression)) +
  geom_point() +
  geom_line() +
  ylab("median time, s") +
  theme_minimal() +
  scale_color_discrete(name = "", labels = c("cpp", "r")) +
  theme(legend.position = "bottom") 

[plot: median batch processing time for the R and C++ implementations]

As you can see, the speedup turned out to be very significant, and catching up with the C++ code by parallelizing the R code is not possible.

3. An iterator for extracting batches from the database

R has a well-deserved reputation for processing data that fits in RAM, while Python is more characterized by iterative data processing that makes out-of-core computations (computations using external memory) easy and natural to implement. A classic example relevant to us in the context of the described problem is deep neural networks trained by gradient descent with the gradient approximated at each step from a small portion of the observations, a mini-batch.

Deep learning frameworks written in Python have special classes implementing iterators over data: tables, images in folders, binary formats and so on. In R we can use all the features of the Python library keras with its various backends via the package of the same name, which in turn works on top of the reticulate package. The latter deserves a separate long article; it not only lets you run Python code from R, it also allows passing objects between R and Python sessions, automatically performing all the necessary type conversions.

We got rid of the need to keep all the data in RAM by using MonetDBLite; all the "neural network" work is done by native code in Python, so we only had to write an iterator over the data, since nothing ready-made exists for such a situation in either R or Python. There are essentially only two requirements for it: it must return batches in an endless loop and preserve its state between iterations (the latter is implemented in R in the simplest way via closures). Previously it was necessary to explicitly convert R arrays into numpy arrays inside the iterator, but the current version of the keras package does that itself.

The iterator for the training and validation data came out as follows:

Iterator for training and validation data

train_generator <- function(db_connection = con,
                            samples_index,
                            num_classes = 340,
                            batch_size = 32,
                            scale = 1,
                            color = FALSE,
                            imagenet_preproc = FALSE) {
  # Check the arguments
  checkmate::assert_class(con, "DBIConnection")
  checkmate::assert_integerish(samples_index)
  checkmate::assert_count(num_classes)
  checkmate::assert_count(batch_size)
  checkmate::assert_number(scale, lower = 0.001, upper = 5)
  checkmate::assert_flag(color)
  checkmate::assert_flag(imagenet_preproc)

  # Shuffle, so that used batch indices can be taken and removed in order
  dt <- data.table::data.table(id = sample(samples_index))
  # Assign batch numbers
  dt[, batch := (.I - 1L) %/% batch_size + 1L]
  # Keep only complete batches and index them
  dt <- dt[, if (.N == batch_size) .SD, keyby = batch]
  # Initialize the counter
  i <- 1
  # Number of batches
  max_i <- dt[, max(batch)]

  # Prepare the expression for unloading the data
  sql <- sprintf(
    "PREPARE SELECT drawing, label_int FROM doodles WHERE id IN (%s)",
    paste(rep("?", batch_size), collapse = ",")
  )
  res <- DBI::dbSendQuery(con, sql)

  # An analogue of keras::to_categorical
  to_categorical <- function(x, num) {
    n <- length(x)
    m <- numeric(n * num)
    m[x * n + seq_len(n)] <- 1
    dim(m) <- c(n, num)
    return(m)
  }

  # Closure
  function() {
    # Start a new epoch
    if (i > max_i) {
      dt[, id := sample(id)]
      data.table::setkey(dt, batch)
      # Reset the counter
      i <<- 1
      max_i <<- dt[, max(batch)]
    }

    # IDs for data extraction
    batch_ind <- dt[batch == i, id]
    # Fetch the data
    batch <- DBI::dbFetch(DBI::dbBind(res, as.list(batch_ind)), n = -1)

    # Increment the counter
    i <<- i + 1

    # Parse JSON and prepare the array
    batch_x <- cpp_process_json_vector(batch$drawing, scale = scale, color = color)
    if (imagenet_preproc) {
      # Шкалирование c интервала [0, 1] на интервал [-1, 1]
      batch_x <- (batch_x - 0.5) * 2
    }

    batch_y <- to_categorical(batch$label_int, num_classes)
    result <- list(batch_x, batch_y)
    return(result)
  }
}

The function takes a variable with a database connection, the numbers of the rows to use, the number of classes, the batch size, the scale (scale = 1 corresponds to rendering images of 256x256 pixels, scale = 0.5 to 128x128 pixels), a color flag (with color = FALSE strokes are rendered in grayscale, with color = TRUE each stroke is drawn in a new color) and a preprocessing flag for networks pre-trained on imagenet. The latter is needed to rescale pixel values from the interval [0, 1] to the interval [-1, 1], which was used when training the models supplied with keras.

The outer function contains the argument type checks, a data.table with randomly shuffled row numbers from samples_index and their batch numbers, a counter and the maximum number of batches, as well as the SQL statement for fetching data from the database. In addition, we defined a fast analogue of keras::to_categorical() inside it. We used almost all of the data for training, leaving half a percent for validation, so the epoch size was limited by the steps_per_epoch parameter when calling keras::fit_generator(), and the condition if (i > max_i) effectively only fired for the validation iterator.
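The home-grown to_categorical() in the iterator assumes 0-based labels (label_int in the database starts at 0) and fills the one-hot matrix with a single column-major indexing trick; standalone it behaves like this:

```r
# Copy of the iterator's one-hot helper; x holds 0-based class labels.
to_categorical <- function(x, num) {
  n <- length(x)
  m <- numeric(n * num)
  # Row i, column x[i] + 1 has linear index i + x[i] * n in a column-major matrix
  m[x * n + seq_len(n)] <- 1
  dim(m) <- c(n, num)
  m
}

to_categorical(c(0L, 2L, 1L), num = 3)
#      [,1] [,2] [,3]
# [1,]    1    0    0
# [2,]    0    0    1
# [3,]    0    1    0
```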

The inner function fetches the row indices for the next batch, unloads the corresponding records from the database while incrementing the batch counter, parses the JSON (the cpp_process_json_vector() function, written in C++) and builds the arrays corresponding to the images. Then one-hot vectors with the class labels are created, and the arrays with pixel values and the labels are combined into a list, which is the return value. To speed things up, we relied on building indexes on the data.table tables and on modification by reference — without these data.table "tricks" it is hard to imagine working efficiently with any significant amount of data in R.
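The data.table idioms the iterator relies on (`:=` assignment by reference and key-based indexing) can be seen in isolation in this small example:

```r
library(data.table)

dt <- data.table(id = c(30L, 10L, 20L, 40L, 50L))
# `:=` adds the batch column in place, without copying the table
dt[, batch := (.I - 1L) %/% 2L + 1L]
# setkey() sorts by reference and builds an index for fast `batch == i` lookups
setkey(dt, batch)
dt[batch == 1L, id]  # 30 10
```

Both operations mutate `dt` in place; no intermediate copy of the table is ever created, which is exactly what keeps memory usage flat on multi-gigabyte tables.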

The results of speed measurements on a Core i5 laptop are as follows:

Iterator benchmark

library(Rcpp)
library(keras)
library(ggplot2)

source("utils/rcpp.R")
source("utils/keras_iterator.R")

con <- DBI::dbConnect(drv = MonetDBLite::MonetDBLite(), Sys.getenv("DBDIR"))

ind <- seq_len(DBI::dbGetQuery(con, "SELECT count(*) FROM doodles")[[1L]])
num_classes <- DBI::dbGetQuery(con, "SELECT max(label_int) + 1 FROM doodles")[[1L]]

# Indices for the training set
train_ind <- sample(ind, floor(length(ind) * 0.995))
# Indices for the validation set
val_ind <- ind[-train_ind]
rm(ind)
# Scale factor
scale <- 0.5

# Run the benchmark
res_bench <- bench::press(
  batch_size = 2^(4:10),
  {
    it1 <- train_generator(
      db_connection = con,
      samples_index = train_ind,
      num_classes = num_classes,
      batch_size = batch_size,
      scale = scale
    )
    bench::mark(
      it1(),
      min_iterations = 50L
    )
  }
)
# Benchmark parameters
cols <- c("batch_size", "min", "median", "max", "itr/sec", "total_time", "n_itr")
res_bench[, cols]

#   batch_size      min   median      max `itr/sec` total_time n_itr
#        <dbl> <bch:tm> <bch:tm> <bch:tm>     <dbl>   <bch:tm> <int>
# 1         16     25ms  64.36ms   92.2ms     15.9       3.09s    49
# 2         32   48.4ms 118.13ms 197.24ms     8.17       5.88s    48
# 3         64   69.3ms 117.93ms 181.14ms     8.57       5.83s    50
# 4        128  157.2ms 240.74ms 503.87ms     3.85      12.71s    49
# 5        256  359.3ms 613.52ms 988.73ms     1.54       30.5s    47
# 6        512  884.7ms    1.53s    2.07s     0.674      1.11m    45
# 7       1024     2.7s    3.83s    5.47s     0.261      2.81m    44

ggplot(res_bench, aes(x = factor(batch_size), y = median, group = 1)) +
    geom_point() +
    geom_line() +
    ylab("median time, s") +
    theme_minimal()

DBI::dbDisconnect(con, shutdown = TRUE)

(figure: median iterator time as a function of batch size)

If you have enough RAM, you can seriously speed up the database by moving it into that same RAM (32 GB is enough for our task). On Linux, a tmpfs partition is mounted by default at /dev/shm, occupying up to half of the RAM. You can allocate more by editing /etc/fstab to get an entry like tmpfs /dev/shm tmpfs defaults,size=25g 0 0. Be sure to reboot and check the result by running df -h.
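The edit described above boils down to one /etc/fstab line plus a remount; a sketch for a typical Linux setup (the 25g size is just the example from the text, and the commands need root):

```shell
# /etc/fstab entry giving tmpfs on /dev/shm an explicit 25 GB cap:
#   tmpfs /dev/shm tmpfs defaults,size=25g 0 0
# Instead of rebooting, the running mount can also be resized in place:
sudo mount -o remount,size=25g /dev/shm
df -h /dev/shm   # verify the new size
```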

The iterator for the test data looks much simpler, since the test dataset fits entirely into RAM:

Iterator for test data

test_generator <- function(dt,
                           batch_size = 32,
                           scale = 1,
                           color = FALSE,
                           imagenet_preproc = FALSE) {

# Argument checks
  checkmate::assert_data_table(dt)
  checkmate::assert_count(batch_size)
  checkmate::assert_number(scale, lower = 0.001, upper = 5)
  checkmate::assert_flag(color)
  checkmate::assert_flag(imagenet_preproc)

# Assign batch numbers
  dt[, batch := (.I - 1L) %/% batch_size + 1L]
  data.table::setkey(dt, batch)
  i <- 1
  max_i <- dt[, max(batch)]

# Closure
  function() {
    batch_x <- cpp_process_json_vector(dt[batch == i, drawing], 
                                       scale = scale, color = color)
    if (imagenet_preproc) {
# Rescale from the interval [0, 1] to the interval [-1, 1]
      batch_x <- (batch_x - 0.5) * 2
    }
    result <- list(batch_x)
    i <<- i + 1
    return(result)
  }
}

4. Choosing a model architecture

The first architecture we used was mobilenet v1, described in this article. It ships as standard with keras and is therefore available in the R package of the same name. But when we tried to use it with single-channel images, a strange thing came to light: the input tensor must always have the shape (batch, height, width, 3), i.e. the number of channels cannot be changed. There is no such limitation in Python, so we hurried to write our own implementation of this architecture, following the original paper (without the dropout that the keras version has):

Mobilenet v1 architecture

library(keras)

top_3_categorical_accuracy <- custom_metric(
    name = "top_3_categorical_accuracy",
    metric_fn = function(y_true, y_pred) {
         metric_top_k_categorical_accuracy(y_true, y_pred, k = 3)
    }
)

layer_sep_conv_bn <- function(object, 
                              filters,
                              alpha = 1,
                              depth_multiplier = 1,
                              strides = c(2, 2)) {

  # NB! depth_multiplier !=  resolution multiplier
  # https://github.com/keras-team/keras/issues/10349

  layer_depthwise_conv_2d(
    object = object,
    kernel_size = c(3, 3), 
    strides = strides,
    padding = "same",
    depth_multiplier = depth_multiplier
  ) %>%
  layer_batch_normalization() %>% 
  layer_activation_relu() %>%
  layer_conv_2d(
    filters = filters * alpha,
    kernel_size = c(1, 1), 
    strides = c(1, 1)
  ) %>%
  layer_batch_normalization() %>% 
  layer_activation_relu() 
}

get_mobilenet_v1 <- function(input_shape = c(224, 224, 1),
                             num_classes = 340,
                             alpha = 1,
                             depth_multiplier = 1,
                             optimizer = optimizer_adam(lr = 0.002),
                             loss = "categorical_crossentropy",
                             metrics = c("categorical_crossentropy",
                                         top_3_categorical_accuracy)) {

  inputs <- layer_input(shape = input_shape)

  outputs <- inputs %>%
    layer_conv_2d(filters = 32, kernel_size = c(3, 3), strides = c(2, 2), padding = "same") %>%
    layer_batch_normalization() %>% 
    layer_activation_relu() %>%
    layer_sep_conv_bn(filters = 64, strides = c(1, 1)) %>%
    layer_sep_conv_bn(filters = 128, strides = c(2, 2)) %>%
    layer_sep_conv_bn(filters = 128, strides = c(1, 1)) %>%
    layer_sep_conv_bn(filters = 256, strides = c(2, 2)) %>%
    layer_sep_conv_bn(filters = 256, strides = c(1, 1)) %>%
    layer_sep_conv_bn(filters = 512, strides = c(2, 2)) %>%
    layer_sep_conv_bn(filters = 512, strides = c(1, 1)) %>%
    layer_sep_conv_bn(filters = 512, strides = c(1, 1)) %>%
    layer_sep_conv_bn(filters = 512, strides = c(1, 1)) %>%
    layer_sep_conv_bn(filters = 512, strides = c(1, 1)) %>%
    layer_sep_conv_bn(filters = 512, strides = c(1, 1)) %>%
    layer_sep_conv_bn(filters = 1024, strides = c(2, 2)) %>%
    layer_sep_conv_bn(filters = 1024, strides = c(1, 1)) %>%
    layer_global_average_pooling_2d() %>%
    layer_dense(units = num_classes) %>%
    layer_activation_softmax()

    model <- keras_model(
      inputs = inputs,
      outputs = outputs
    )

    model %>% compile(
      optimizer = optimizer,
      loss = loss,
      metrics = metrics
    )

    return(model)
}

The drawbacks of this approach are obvious. We wanted to test many models, and rewriting each architecture by hand is the last thing we wanted to do. We were also deprived of the opportunity to use the weights of models pre-trained on imagenet. As usual, studying the documentation helped. The get_config() function lets you obtain a description of the model in a form suitable for editing (base_model_conf$layers is a regular R list), and the from_config() function performs the reverse conversion into a model object:

base_model_conf <- get_config(base_model)
base_model_conf$layers[[1]]$config$batch_input_shape[[4]] <- 1L
base_model <- from_config(base_model_conf)

Now it is easy to write a universal function that returns any of the keras-supplied models, with or without weights pre-trained on imagenet:

Function for loading ready-made architectures

get_model <- function(name = "mobilenet_v2",
                      input_shape = NULL,
                      weights = "imagenet",
                      pooling = "avg",
                      num_classes = NULL,
                      optimizer = keras::optimizer_adam(lr = 0.002),
                      loss = "categorical_crossentropy",
                      metrics = NULL,
                      color = TRUE,
                      compile = FALSE) {
# Argument checks
  checkmate::assert_string(name)
  checkmate::assert_integerish(input_shape, lower = 1, upper = 256, len = 3)
  checkmate::assert_count(num_classes)
  checkmate::assert_flag(color)
  checkmate::assert_flag(compile)

  # Get the model constructor from the keras package
  model_fun <- get0(paste0("application_", name), envir = asNamespace("keras"))
  # Check that the object exists in the package
  if (is.null(model_fun)) {
    stop("Model ", shQuote(name), " not found.", call. = FALSE)
  }

  base_model <- model_fun(
    input_shape = input_shape,
    include_top = FALSE,
    weights = weights,
    pooling = pooling
  )

  # If the image is not color, change the input dimensionality
  if (!color) {
    base_model_conf <- keras::get_config(base_model)
    base_model_conf$layers[[1]]$config$batch_input_shape[[4]] <- 1L
    base_model <- keras::from_config(base_model_conf)
  }

  predictions <- keras::get_layer(base_model, "global_average_pooling2d_1")$output
  predictions <- keras::layer_dense(predictions, units = num_classes, activation = "softmax")
  model <- keras::keras_model(
    inputs = base_model$input,
    outputs = predictions
  )

  if (compile) {
    keras::compile(
      object = model,
      optimizer = optimizer,
      loss = loss,
      metrics = metrics
    )
  }

  return(model)
}

When using single-channel images there are no pre-trained weights. This could have been fixed: using the get_weights() function, take the model weights as a list of R arrays, change the dimensionality of the first element of this list (by taking a single color channel or averaging all three), and then load the weights back into the model with set_weights(). We never added this functionality, because by that point it was already clear that working with color images was more productive.
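The weight conversion just described could be sketched roughly as follows. This is a hypothetical sketch in pure R on a plain array (rgb_to_gray_kernel is our illustrative name, not project code); plugging the result back into a model would go through keras::get_weights()/set_weights():

```r
# Hypothetical sketch: collapse a pretrained first-layer convolution kernel of
# shape (kh, kw, 3, out_channels) down to one input channel by averaging the
# three color channels.
rgb_to_gray_kernel <- function(w) {
  stopifnot(length(dim(w)) == 4L, dim(w)[3] == 3L)
  w_gray <- apply(w, c(1, 2, 4), mean)           # -> (kh, kw, out_channels)
  dim(w_gray) <- c(dim(w)[1:2], 1L, dim(w)[4])   # -> (kh, kw, 1, out_channels)
  w_gray
}

w <- array(rnorm(3 * 3 * 3 * 32), dim = c(3, 3, 3, 32))
dim(rgb_to_gray_kernel(w))  # 3 3 1 32
```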

We carried out most of the experiments with mobilenet versions 1 and 2, as well as resnet34. More modern architectures such as SE-ResNeXt performed well in this competition. Unfortunately, we had no ready-made implementations at our disposal, and we did not write our own (but we definitely will).

5. Parametrizing the scripts

For convenience, all the code for launching training was designed as a single script, parametrized with docopt as follows:

doc <- '
Usage:
  train_nn.R --help
  train_nn.R --list-models
  train_nn.R [options]

Options:
  -h --help                   Show this message.
  -l --list-models            List available models.
  -m --model=<model>          Neural network model name [default: mobilenet_v2].
  -b --batch-size=<size>      Batch size [default: 32].
  -s --scale-factor=<ratio>   Scale factor [default: 0.5].
  -c --color                  Use color lines [default: FALSE].
  -d --db-dir=<path>          Path to database directory [default: Sys.getenv("db_dir")].
  -r --validate-ratio=<ratio> Validate sample ratio [default: 0.995].
  -n --n-gpu=<number>         Number of GPUs [default: 1].
'
args <- docopt::docopt(doc)

The docopt package is an implementation of http://docopt.org/ for R. It lets you launch scripts with simple commands like Rscript bin/train_nn.R -m resnet50 -c -d /home/andrey/doodle_db or ./bin/train_nn.R -m resnet50 -c -d /home/andrey/doodle_db if the file train_nn.R is executable (this command starts training the resnet50 model on three-channel 128x128-pixel images; the database is expected in the directory /home/andrey/doodle_db). You can add the learning rate, the optimizer type and any other tunable parameters to the list. While preparing this publication it turned out that the mobilenet_v2 architecture from the current version of keras cannot be used from R because of changes not yet accounted for in the R package; we are waiting for this to be fixed.
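Parsing works the same way outside the command line, which makes such interfaces easy to test; a tiny sketch (the demo usage string is ours, not the project's) also shows that docopt returns strings, so numeric options need explicit conversion:

```r
library(docopt)

doc <- "Usage: demo.R [options]

Options:
  -b --batch-size=<size>  Batch size [default: 32]."

# args normally defaults to commandArgs(TRUE); here we pass it explicitly
args <- docopt(doc, args = c("--batch-size", "64"))
batch_size <- as.integer(args[["--batch-size"]])  # docopt values are strings
batch_size  # 64
```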

This approach made it possible to significantly speed up experiments with different models compared with the more traditional launching of scripts from RStudio (we note the tfruns package as a possible alternative). But the main advantage is the ability to easily manage the launching of scripts in Docker or simply on a server, without having to install RStudio for that.

6. Dockerizing the scripts

We used Docker to ensure portability of the model-training environment between team members and for rapid deployment in the cloud. You can start getting acquainted with this tool, which is relatively unusual for an R programmer, with this series of publications or this video course.

Docker lets you both build your own images from scratch and use other images as a base for your own. While analyzing the available options, we came to the conclusion that installing the NVIDIA drivers, CUDA+cuDNN and the Python libraries is a rather bulky part of the image, so we decided to take the official image tensorflow/tensorflow:1.12.0-gpu as a base and add the necessary R packages on top.

The final Dockerfile looked like this:

Dockerfile

FROM tensorflow/tensorflow:1.12.0-gpu

MAINTAINER Artem Klevtsov <[email protected]>

SHELL ["/bin/bash", "-c"]

ARG LOCALE="en_US.UTF-8"
ARG APT_PKG="libopencv-dev r-base r-base-dev littler"
ARG R_BIN_PKG="futile.logger checkmate data.table rcpp rapidjsonr dbi keras jsonlite curl digest remotes"
ARG R_SRC_PKG="xtensor RcppThread docopt MonetDBLite"
ARG PY_PIP_PKG="keras"
ARG DIRS="/db /app /app/data /app/models /app/logs"

RUN source /etc/os-release && \
    echo "deb https://cloud.r-project.org/bin/linux/ubuntu ${UBUNTU_CODENAME}-cran35/" > /etc/apt/sources.list.d/cran35.list && \
    apt-key adv --keyserver keyserver.ubuntu.com --recv-keys E084DAB9 && \
    add-apt-repository -y ppa:marutter/c2d4u3.5 && \
    add-apt-repository -y ppa:timsc/opencv-3.4 && \
    apt-get update && \
    apt-get install -y locales && \
    locale-gen ${LOCALE} && \
    apt-get install -y --no-install-recommends ${APT_PKG} && \
    ln -s /usr/lib/R/site-library/littler/examples/install.r /usr/local/bin/install.r && \
    ln -s /usr/lib/R/site-library/littler/examples/install2.r /usr/local/bin/install2.r && \
    ln -s /usr/lib/R/site-library/littler/examples/installGithub.r /usr/local/bin/installGithub.r && \
    echo 'options(Ncpus = parallel::detectCores())' >> /etc/R/Rprofile.site && \
    echo 'options(repos = c(CRAN = "https://cloud.r-project.org"))' >> /etc/R/Rprofile.site && \
    apt-get install -y $(printf "r-cran-%s " ${R_BIN_PKG}) && \
    install.r ${R_SRC_PKG} && \
    pip install ${PY_PIP_PKG} && \
    mkdir -p ${DIRS} && \
    chmod 777 ${DIRS} && \
    rm -rf /tmp/downloaded_packages/ /tmp/*.rds && \
    rm -rf /var/lib/apt/lists/*

COPY utils /app/utils
COPY src /app/src
COPY tests /app/tests
COPY bin/*.R /app/

ENV DBDIR="/db"
ENV CUDA_HOME="/usr/local/cuda"
ENV PATH="/app:${PATH}"

WORKDIR /app

VOLUME /db
VOLUME /app

CMD bash

For convenience, the packages used were put into variables; the bulk of the scripts we wrote are copied into the container during the build. We also changed the command shell to /bin/bash for convenient use of the contents of /etc/os-release. This removed the need to hard-code the OS version in the Dockerfile.

In addition, a small bash script was written that lets you launch a container with various commands. For example, these can be the neural-network training scripts placed inside the container earlier, or a command shell for debugging and monitoring the container:

Script to launch the container

#!/bin/sh

DBDIR=${PWD}/db
LOGSDIR=${PWD}/logs
MODELDIR=${PWD}/models
DATADIR=${PWD}/data
ARGS="--runtime=nvidia --rm -v ${DBDIR}:/db -v ${LOGSDIR}:/app/logs -v ${MODELDIR}:/app/models -v ${DATADIR}:/app/data"

if [ -z "$1" ]; then
    CMD="Rscript /app/train_nn.R"
elif [ "$1" = "bash" ]; then
    ARGS="${ARGS} -ti"
else
    CMD="Rscript /app/train_nn.R $@"
fi

docker run ${ARGS} doodles-tf ${CMD}

If this bash script is run without parameters, the train_nn.R script is called inside the container with default values; if the first positional argument is "bash", the container starts interactively with a command shell. In all other cases, the values of the positional arguments are forwarded: CMD="Rscript /app/train_nn.R $@".

Note that the directories with the source data and the database, as well as the directory for saving trained models, are mounted into the container from the host system, which lets you access the results of the scripts without extra manipulations.

7. Using multiple GPUs on Google Cloud

One of the features of the competition was its very noisy data (see the title picture, borrowed from @Leigh.plt on the ODS slack). Large batches help fight this, and after experiments on a PC with 1 GPU we decided to master training models on several GPUs in the cloud. We used Google Cloud (a good guide to the basics) because of the large choice of available configurations, reasonable prices and the $300 bonus. Out of greed, I ordered a 4xV100 instance with an SSD and a ton of RAM, and that was a big mistake. Such a machine eats money fast; you can go broke experimenting without a proven pipeline. For educational purposes it is better to take a K80. But the large amount of RAM did come in handy: the cloud SSD did not impress with its performance, so the database was moved to /dev/shm.

Of greatest interest is the code fragment responsible for using multiple GPUs. First, the model is created on the CPU using a context manager, just as in Python:

with(tensorflow::tf$device("/cpu:0"), {
  model_cpu <- get_model(
    name = model_name,
    input_shape = input_shape,
    weights = weights,
    metrics = c(top_3_categorical_accuracy),
    compile = FALSE
  )
})

Then the uncompiled model (this is important) is copied onto the given number of available GPUs, and only after that is it compiled:

model <- keras::multi_gpu_model(model_cpu, gpus = n_gpu)
keras::compile(
  object = model,
  optimizer = keras::optimizer_adam(lr = 0.0004),
  loss = "categorical_crossentropy",
  metrics = c(top_3_categorical_accuracy)
)

The classic approach of freezing all layers except the last one, training only the last layer, then unfreezing and fine-tuning the whole model could not be made to work on several GPUs.

Training was monitored without tensorboard; we limited ourselves to writing logs and saving models with informative names after each epoch:

Callbacks

# Log file name template
log_file_tmpl <- file.path("logs", sprintf(
  "%s_%d_%dch_%s.csv",
  model_name,
  dim_size,
  channels,
  format(Sys.time(), "%Y%m%d%H%M%OS")
))
# Model file name template
model_file_tmpl <- file.path("models", sprintf(
  "%s_%d_%dch_{epoch:02d}_{val_loss:.2f}.h5",
  model_name,
  dim_size,
  channels
))

callbacks_list <- list(
  keras::callback_csv_logger(
    filename = log_file_tmpl
  ),
  keras::callback_early_stopping(
    monitor = "val_loss",
    min_delta = 1e-4,
    patience = 8,
    verbose = 1,
    mode = "min"
  ),
  keras::callback_reduce_lr_on_plateau(
    monitor = "val_loss",
    factor = 0.5, # halve the lr
    patience = 4,
    verbose = 1,
    min_delta = 1e-4,
    mode = "min"
  ),
  keras::callback_model_checkpoint(
    filepath = model_file_tmpl,
    monitor = "val_loss",
    save_best_only = FALSE,
    save_weights_only = FALSE,
    mode = "min"
  )
)

8. Instead of a conclusion

A number of problems we ran into have not yet been overcome:

  • keras has no ready-made function for automatically searching for the optimal learning rate (an analogue of lr_finder in the fast.ai library); with some effort, third-party implementations can be ported to R, for example this one;
  • as a consequence of the previous point, it was not possible to pick the right learning rate when using several GPUs;
  • there is a shortage of modern neural network architectures, especially ones pre-trained on imagenet;
  • no one cycle policy and no discriminative learning rates (cosine annealing was implemented at our request, thanks skeydan).

Useful things learned from this competition:

  • On relatively low-powered hardware you can work with decent volumes of data (many times the size of RAM) without pain. The data.table package saves memory thanks to in-place modification of tables, which avoids copying them, and when used correctly its capabilities almost always show the highest speed among all tools known to us for scripting languages. Saving the data in a database lets you, in many cases, not think at all about squeezing the entire dataset into RAM.
  • Slow functions in R can be replaced by fast ones in C++ using the Rcpp package. If, in addition, RcppThread or RcppParallel is used, we get cross-platform multi-threaded implementations, so there is no need to parallelize the code at the R level.
  • The Rcpp package can be used without serious knowledge of C++; the required minimum is outlined here. Header files for a number of cool C++ libraries like xtensor are available on CRAN, i.e. an infrastructure is forming for implementing projects that integrate ready-made high-performance C++ code into R. An additional convenience is syntax highlighting and a static C++ code analyzer in RStudio.
  • docopt lets you run self-contained scripts with parameters. This is convenient for use on a remote server, including under docker. In RStudio it is inconvenient to conduct many-hour experiments with training neural networks, and installing the IDE on the server itself is not always justified.
  • Docker ensures code portability and reproducibility of results between developers with different versions of the OS and libraries, as well as ease of execution on servers. The whole training pipeline can be launched with just one command.
  • Google Cloud is a budget-friendly way to experiment on expensive hardware, but you have to choose configurations carefully.
  • Measuring the speed of individual code fragments is very useful, especially when combining R and C++, and with the bench package it is also very easy.
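To illustrate the Rcpp point above: the entry barrier really is low — a C++ function can be compiled and called in a couple of lines (a toy example, assuming Rcpp and a C++ toolchain are installed):

```r
library(Rcpp)

# Compile a C++ function from a string and expose it to R under the same name
cppFunction("
int add_cpp(int x, int y) {
  return x + y;
}
")

add_cpp(2, 3)  # 5
```

For anything larger than a one-liner, the same mechanism is available via sourceCpp() on a standalone .cpp file, which is how the competition's cpp_process_json_vector() style of helper is typically organized.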

Overall this experience was very rewarding, and we continue working to resolve some of the issues raised.

Source: will.com
