estimation of sampling errors - translation into Russian

Translation and analysis of words by ChatGPT artificial intelligence

On this page you can get a detailed analysis of a word or phrase, produced with the best artificial-intelligence technology available today:

  • how the word is used
  • frequency of use
  • whether it is used more often in spoken or written speech
  • translation options
  • usage examples (several phrases with translation)
  • etymology


Estimation of Distribution Algorithm; Estimation of Distribution Algorithms; PMBGA
Results found: 51695
estimation of sampling errors

statistics

оценивание ошибок выборочного обследования
snowball sampling         
NONPROBABILITY SAMPLING TECHNIQUE
Snowball sample; Respondent-driven sampling; Snowball method; Snowballed sample
выборка типа «снежный ком» (snowball sample); an empirical, non-probability sample formed when it is difficult to delineate the boundaries of the population: experts and so-called "rare elements" are selected purposively and, after being interviewed, can point to the next element, and so on (used in studying closed social groups such as religious sects, gangs, etc.).
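
As a rough illustration of the referral process this entry describes, here is a hypothetical Python sketch; the contact graph, the seed respondents, and the referral limit are all invented for illustration and are not part of the entry:

import random

random.seed(0)

# hypothetical contact structure of a hard-to-reach population (invented data)
contacts = {
    "seed1": ["a", "b"],
    "seed2": ["c"],
    "a": ["d", "e"],
    "b": ["e"],
    "c": ["f"],
    "d": [], "e": ["g"], "f": [], "g": [],
}

def snowball_sample(seeds, waves, k=2):
    # interview the seeds, then up to k fresh referrals per respondent, per wave
    sampled, frontier = set(seeds), list(seeds)
    for _ in range(waves):
        referrals = []
        for person in frontier:
            fresh = [p for p in contacts.get(person, []) if p not in sampled]
            referrals.extend(random.sample(fresh, min(k, len(fresh))))
        frontier = list(dict.fromkeys(referrals))   # de-duplicate, keep order
        sampled.update(frontier)
    return sampled

print(snowball_sample(["seed1", "seed2"], waves=2))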
sampling theorem         
  [Figure captions from the Wikipedia article: two sine waves can have identical samples when at least one of them is above half the sample rate, so one is an alias of the other at that rate; a properly sampled image versus a subsampled image showing a Moiré pattern; a signal reconstructed from its samples matches the original once the sample rate reaches twice the highest frequency in its spectrum; the spectrum of a properly sampled bandlimited signal can be recovered from its samples with a brick-wall low-pass filter that removes the adjacent DTFT images.]
THEOREM
Nyquist theorem; Shannon sampling theorem; Nyquist sampling theorem; Nyquist's theorem; Shannon-Nyquist sampling theorem; Nyquist-Shannon Sampling Theorem; Nyqvist-Shannon sampling theorem; Sampling theorem; Nyquist Sampling Theorem; Nyquist-Shannon sampling theorem; Nyquist–Shannon theorem; Nyquist–Shannon Theorem; Nyquist Theorem; Shannon-Nyquist theorem; Nyquist sampling; Nyquist's law; Nyquist law; Coherent sampling; Nyqvist limit; Raabe condition; Nyquist-Shannon Theorem; Nyquist-Shannon theorem; Nyquist noise theorem; Shannon–Nyquist theorem; Kotelnikov-Shannon theorem; Kotelnikov–Shannon theorem; Nyquist-Shannon; Kotelnikov theorem; Nyquist's sampling theorem; Sampling Theorem; Nyquist Shannon theorem; Nyquist–Shannon–Kotelnikov sampling theorem; Whittaker–Shannon–Kotelnikov sampling theorem; Whittaker–Nyquist–Kotelnikov–Shannon sampling theorem; Nyquist-Shannon-Kotelnikov sampling theorem; Whittaker-Shannon-Kotelnikov sampling theorem; Whittaker-Nyquist-Kotelnikov-Shannon sampling theorem; Cardinal theorem of interpolation; WKS sampling theorem; Whittaker–Kotelnikow–Shannon sampling theorem; Whittaker-Kotelnikow-Shannon sampling theorem; Nyquist–Shannon–Kotelnikov; Whittaker–Shannon–Kotelnikov; Whittaker–Nyquist–Kotelnikov–Shannon; Nyquist-Shannon-Kotelnikov; Whittaker-Shannon-Kotelnikov; Whittaker-Nyquist-Kotelnikov-Shannon; Whittaker–Shannon sampling theorem; Whittaker–Nyquist–Shannon sampling theorem; Whittaker-Nyquist-Shannon sampling theorem; Whittaker-Shannon sampling theorem
general vocabulary

теорема отсчетов (теорема дискретизации)
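
The captions above note that two sine waves can have identical samples when at least one of them is above half the sample rate. A minimal numeric check of that statement in Python; the sample rate and tone frequencies are illustrative assumptions, not values from the entry:

import numpy as np

fs = 10.0                                   # assumed sample rate, Hz
t = np.arange(20) / fs                      # 20 sample instants

x  = np.sin(2 * np.pi * 2.0 * t)            # 2 Hz tone, below fs/2
xa = np.sin(2 * np.pi * (2.0 + fs) * t)     # 12 Hz tone, above fs/2

print(np.allclose(x, xa))                   # True: identical sample sequences, so 12 Hz aliases to 2 Hz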

overextension
LINGUISTIC CONCEPT
Errors in Early Word Use; Developmental error; Overextension

medicine

избыточное разгибание

переразгибание

чрезмерное растяжение

economics

1) завышенная оценка (напр. активов)
2) чрезмерное кредитование
3) бирж. покупка ценных бумаг или товаров на слишком большую сумму
4) см. over-expansion 2
statistical sampling         
  [Figure captions: visual representations of selecting a random sample using the cluster, stratified, and systematic sampling techniques.]
SELECTION OF DATA POINTS IN STATISTICS
Sample (statistics); Statistical sampling; Sample survey; Random sampling; Random sample; Statistical sample; Sampling method; Sample population; Sample poppulation; Sampling (mathematics); Random allocation; Sample set; Representative sample; Sampling methods; Sample (probability); Sampling technique; Probability sample; Sampling techiques; Sampling techniques; Probability sampling; Sampling plan; Random sampling with replacement; Sampling (A level business); Random selection; Statistical Sample; Randomly selected; Unbiased sampling; Sampling Theory; Sampling scheme; Sample group; Data sample; Data sampling; N (statistics); With replacement; Without replacement; Double Labelling Experiment; Applications of statistical sampling; Random samples; Sample Surveys

mathematics

розыгрыш

sample survey         

general vocabulary

выборочное обследование


Wikipedia

Estimation of distribution algorithm

Estimation of distribution algorithms (EDAs), sometimes called probabilistic model-building genetic algorithms (PMBGAs), are stochastic optimization methods that guide the search for the optimum by building and sampling explicit probabilistic models of promising candidate solutions. Optimization is viewed as a series of incremental updates of a probabilistic model, starting with the model encoding an uninformative prior over admissible solutions and ending with the model that generates only the global optima.

EDAs belong to the class of evolutionary algorithms. The main difference between EDAs and most conventional evolutionary algorithms is that evolutionary algorithms generate new candidate solutions using an implicit distribution defined by one or more variation operators, whereas EDAs use an explicit probability distribution encoded by a Bayesian network, a multivariate normal distribution, or another model class. As with other evolutionary algorithms, EDAs can be used to solve optimization problems defined over a range of representations, from vectors to LISP-style S-expressions, and the quality of candidate solutions is often evaluated using one or more objective functions.

The general procedure of an EDA is outlined in the following:

t := 0
initialize model M(0) to represent uniform distribution over admissible solutions
while (termination criteria not met) do
    P := generate N>0 candidate solutions by sampling M(t)
    F := evaluate all candidate solutions in P
    M(t + 1) := adjust_model(P, F, M(t))
    t := t + 1

Using explicit probabilistic models in optimization allowed EDAs to feasibly solve optimization problems that were notoriously difficult for most conventional evolutionary algorithms and traditional optimization techniques, such as problems with high levels of epistasis. A further advantage of EDAs is that these algorithms provide an optimization practitioner with a series of probabilistic models that reveal a great deal of information about the problem being solved. This information can in turn be used to design problem-specific neighborhood operators for local search, to bias future runs of EDAs on a similar problem, or to create an efficient computational model of the problem.

For example, if the population is represented by bit strings of length 4, the EDA can represent the population of promising solutions using a single vector of four probabilities (p1, p2, p3, p4), where each component pi is the probability that position i is a 1. Using this probability vector, it is possible to create an arbitrary number of candidate solutions.
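
As a hedged illustration, the following Python sketch instantiates the general loop above with this probability-vector model (a UMDA-style univariate EDA); the string length, population size, selection rule, objective function, and clipping safeguard are illustrative assumptions rather than part of the article:

import numpy as np

rng = np.random.default_rng(0)

n_bits, pop_size, n_select, n_iters = 4, 50, 10, 30   # assumed settings

def objective(bits):
    return bits.sum(axis=1)             # toy OneMax objective: number of 1s in each string

p = np.full(n_bits, 0.5)                # M(0): uniform distribution over bit strings

for t in range(n_iters):
    pop = (rng.random((pop_size, n_bits)) < p).astype(int)   # sample M(t)
    scores = objective(pop)                                   # evaluate candidates
    best = pop[np.argsort(scores)[-n_select:]]                # select promising solutions
    p = best.mean(axis=0)                                     # M(t+1): refit the probability vector
    p = p.clip(0.05, 0.95)                                    # safeguard: keep probabilities away from 0/1

print(p)                                # probabilities drift toward 1 at every position

Each iteration refits the probability vector to the selected solutions, so the model gradually concentrates on bit strings that score well on the toy objective.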
