sampling process - definition. What is sampling process. Meaning, concept

What (or who) is sampling process - definition

ALGORITHM
Gibbs sampler; Gibbs Sampling; Gibbs point process; Collapsed Gibbs sampler; Collapsed Gibbs sampling; Blocked Gibbs sampling; Blocked Gibbs sampler
  • [Figure: schematic description of the information equality associated with the Gibbs sampler at the i-th step within a cycle (Lee 2008).]

Stochastic process rare event sampling         
SPRES; Stochastic Process Rare Event Sampling; S-PRES
Stochastic Process Rare Event Sampling (SPRES) is a rare-event sampling method in computer simulation, designed specifically for non-equilibrium calculations, including those in which the rare-event rates are time-dependent (a non-stationary process). To treat systems whose dynamics vary in time, whether because an external parameter changes or because the system itself evolves, the scheme for branching paths must be devised so that sampling is distributed evenly in time and takes account of the changing fluxes through different regions of phase space.
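As a rough illustration of the branching-path idea (a toy splitting-and-reweighting sketch under assumed dynamics, not the actual SPRES scheme; all names and parameters below are invented for the example), an ensemble of trajectories can be advanced between fixed time checkpoints and resampled there, cloning paths near the rare region and adjusting statistical weights so that weighted averages stay unbiased:

    import math
    import random

    def step(x, dt=0.01):
        """One step of a toy overdamped diffusion in a quadratic well."""
        return x - x * dt + math.sqrt(2.0 * dt) * random.gauss(0.0, 1.0)

    def resample(paths, n, importance):
        """Checkpoint resampling: clone paths in proportion to an importance
        score and reweight so that weighted averages remain unbiased."""
        scores = [w * importance(x) for x, w in paths]
        total = sum(scores)
        new = []
        for _ in range(n):
            # pick a path with probability proportional to its score
            r, acc, idx = random.uniform(0.0, total), 0.0, 0
            for i, s in enumerate(scores):
                acc += s
                if r <= acc:
                    idx = i
                    break
            x, _ = paths[idx]
            # weight chosen so the selection probability cancels out
            new.append((x, total / (n * importance(x))))
        return new

    paths = [(0.0, 1.0 / 200)] * 200       # 200 walkers, equal initial weight
    for _ in range(20):                    # 20 time checkpoints
        for _ in range(50):                # free dynamics between checkpoints
            paths = [(step(x), w) for x, w in paths]
        # bias cloning toward the rare region x > 2, compensating via the weights
        paths = resample(paths, 200, importance=lambda x: 1.0 + 20.0 * (x > 2.0))

    print("estimated P(x > 2) at the final time:",
          sum(w for x, w in paths if x > 2.0))

The key design point in any such scheme is that every cloning or pruning decision is compensated in the statistical weights, so that concentrating computational effort on rare excursions does not bias the estimated probabilities.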
Snowball sampling         
NONPROBABILITY SAMPLING TECHNIQUE
Snowball sample; Respondent-driven sampling; Snowball method; Snowballed sample
In sociology and statistics research, snowball sampling (also called chain sampling, chain-referral sampling, or referral sampling) is a nonprobability sampling technique in which existing study subjects recruit future subjects from among their acquaintances, so that the sample grows like a rolling snowball (Snowball Sampling, Changing Minds, accessed 8 May 2011).
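As a concrete toy illustration (the network, names, and parameters below are invented for the example), chain-referral recruitment can be simulated on an acquaintance graph: a few seed respondents are chosen, and each recruited person refers some of their acquaintances in successive waves.

    import random

    def snowball_sample(graph, seeds, referrals_per_person=3, waves=3):
        """Toy chain-referral (snowball) sampling over an acquaintance graph.
        graph: dict mapping each person to a list of acquaintances."""
        sampled = set(seeds)
        current_wave = list(seeds)
        for _ in range(waves):
            next_wave = []
            for person in current_wave:
                contacts = [c for c in graph.get(person, []) if c not in sampled]
                referrals = random.sample(contacts, min(referrals_per_person, len(contacts)))
                for referral in referrals:
                    sampled.add(referral)
                    next_wave.append(referral)
            current_wave = next_wave
        return sampled

    # hypothetical acquaintance network
    graph = {
        "ana": ["bea", "carl", "dan"],
        "bea": ["ana", "eva"],
        "carl": ["ana", "dan", "fay"],
        "dan": ["ana", "carl"],
        "eva": ["bea", "fay"],
        "fay": ["carl", "eva"],
    }
    print(snowball_sample(graph, seeds=["ana"], referrals_per_person=2, waves=2))

Because recruitment follows existing social ties rather than random selection, the resulting sample over-represents well-connected individuals, which is why snowball sampling is classed as a nonprobability technique.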
Nyquist Theorem         
  • [Figure: two signals whose sampled sequences are identical, even though the original continuous pre-sampled functions are not. If these were audio signals, x(t) and x_A(t) might not sound the same, but their samples (taken at rate f_s) are identical and would lead to identical reproduced sounds; thus x_A(t) is an alias of x(t) at this sample rate.]
  • [Figure: the samples of two sine waves can be identical when at least one of them is at a frequency above half the sample rate.]
  • [Figure: a family of sinusoids at the critical frequency, all having the same sample sequences of alternating +1 and -1; they are all aliases of each other, even though their frequency is not above half the sample rate.]
  • [Figure: properly sampled image.]
  • [Figure: subsampled image showing a Moiré pattern.]
  • [Figure: on the left, a function (gray/black) is sampled and reconstructed (gold) at steadily increasing sample densities; on the right, the frequency spectrum of the gray/black function, which does not change. The highest frequency in the spectrum is half the width of the entire spectrum. The width of the steadily increasing pink shading equals the sample rate; when it encompasses the entire frequency spectrum it is twice the highest frequency, and that is when the reconstructed waveform matches the sampled one.]
  • [Figure: spectrum X_s(f) of a properly sampled bandlimited signal (blue) and the adjacent DTFT images (green), which do not overlap. A brick-wall low-pass filter H(f) removes the images, leaving the original spectrum X(f) and recovering the original signal from its samples.]
THEOREM
Nyquist theorem; Shannon sampling theorem; Nyquist sampling theorem; Nyquist's theorem; Shannon-Nyquist sampling theorem; Nyquist-Shannon Sampling Theorem; Nyqvist-Shannon sampling theorem; Sampling theorem; Nyquist Sampling Theorem; Nyquist-Shannon sampling theorem; Nyquist–Shannon theorem; Nyquist–Shannon Theorem; Nyquist Theorem; Shannon-Nyquist theorem; Nyquist sampling; Nyquist's law; Nyquist law; Coherent sampling; Nyqvist limit; Raabe condition; Nyquist-Shannon Theorem; Nyquist-Shannon theorem; Nyquist noise theorem; Shannon–Nyquist theorem; Kotelnikov-Shannon theorem; Kotelnikov–Shannon theorem; Nyquist-Shannon; Kotelnikov theorem; Nyquist's sampling theorem; Sampling Theorem; Nyquist Shannon theorem; Nyquist–Shannon–Kotelnikov sampling theorem; Whittaker–Shannon–Kotelnikov sampling theorem; Whittaker–Nyquist–Kotelnikov–Shannon sampling theorem; Nyquist-Shannon-Kotelnikov sampling theorem; Whittaker-Shannon-Kotelnikov sampling theorem; Whittaker-Nyquist-Kotelnikov-Shannon sampling theorem; Cardinal theorem of interpolation; WKS sampling theorem; Whittaker–Kotelnikow–Shannon sampling theorem; Whittaker-Kotelnikow-Shannon sampling theorem; Nyquist–Shannon–Kotelnikov; Whittaker–Shannon–Kotelnikov; Whittaker–Nyquist–Kotelnikov–Shannon; Nyquist-Shannon-Kotelnikov; Whittaker-Shannon-Kotelnikov; Whittaker-Nyquist-Kotelnikov-Shannon; Whittaker–Shannon sampling theorem; Whittaker–Nyquist–Shannon sampling theorem; Whittaker-Nyquist-Shannon sampling theorem; Whittaker-Shannon sampling theorem
<communications> A theorem stating that, when an analogue waveform is digitised, only the frequency components below half the sampling frequency can be represented unambiguously. In order to reconstruct (interpolate) a signal from a sequence of samples, sufficient samples must be recorded to capture the peaks and troughs of the original waveform. If a waveform is sampled at less than twice its highest frequency, the higher-frequency components fold back and corrupt the reconstruction; this phenomenon is called "aliasing" (the high frequencies appear "under an alias"). This is why CD-quality digital audio is sampled at 44,100 Hz, slightly more than twice the roughly 20 kHz upper limit of human hearing. The Nyquist Theorem is not specific to digitised signals (represented by discrete amplitude levels) but applies to any sampled signal (represented by discrete time values), not just sound. {Nyquist (http://geocities.com/bioelectrochemistry/nyquist.htm)} (a page about the man; somewhat inaccurate). (2003-10-21)
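The aliasing described above is easy to verify numerically. The following sketch (the frequencies and sample rate are arbitrary choices for illustration) samples a 3 Hz sine and a 13 Hz sine at 10 Hz; because 13 Hz = 3 Hz + 10 Hz lies above half the sample rate, the two sample sequences coincide and the 13 Hz tone is an alias of the 3 Hz tone.

    import math

    fs = 10.0           # sample rate in Hz
    f1, f2 = 3.0, 13.0  # f2 = f1 + fs, so the two sinusoids alias at this rate

    samples1 = [math.sin(2 * math.pi * f1 * n / fs) for n in range(20)]
    samples2 = [math.sin(2 * math.pi * f2 * n / fs) for n in range(20)]

    # The sample sequences agree to floating-point precision, even though the
    # underlying continuous signals differ: the 13 Hz tone is above fs/2 = 5 Hz,
    # so after sampling it is indistinguishable from (an alias of) the 3 Hz tone.
    print(max(abs(a - b) for a, b in zip(samples1, samples2)))  # ~1e-15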

Wikipedia

Gibbs sampling

In statistics, Gibbs sampling or a Gibbs sampler is a Markov chain Monte Carlo (MCMC) algorithm for obtaining a sequence of observations that approximate draws from a specified multivariate probability distribution, when direct sampling is difficult. This sequence can be used to approximate the joint distribution (e.g., to generate a histogram of the distribution); to approximate the marginal distribution of one of the variables, or some subset of the variables (for example, the unknown parameters or latent variables); or to compute an integral (such as the expected value of one of the variables). Typically, some of the variables correspond to observations whose values are known, and hence do not need to be sampled.

Gibbs sampling is commonly used as a means of statistical inference, especially Bayesian inference. It is a randomized algorithm (i.e. an algorithm that makes use of random numbers), and is an alternative to deterministic algorithms for statistical inference such as the expectation-maximization algorithm (EM).

As with other MCMC algorithms, Gibbs sampling generates a Markov chain of samples, each of which is correlated with nearby samples. As a result, care must be taken if independent samples are desired. Generally, samples from the beginning of the chain (the burn-in period) may not accurately represent the desired distribution and are usually discarded.
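To make the procedure concrete, here is a minimal sketch (assuming, purely for illustration, a standard bivariate normal target with correlation rho, for which each full conditional is itself a univariate normal); the initial burn-in draws are discarded as described above.

    import random

    def gibbs_bivariate_normal(n_samples, rho=0.8, burn_in=500):
        """Gibbs sampler for a standard bivariate normal with correlation rho.
        Each full conditional is univariate normal: x | y ~ N(rho*y, 1 - rho^2)."""
        sd = (1.0 - rho * rho) ** 0.5
        x, y = 0.0, 0.0
        chain = []
        for i in range(n_samples + burn_in):
            x = random.gauss(rho * y, sd)   # draw x from p(x | y)
            y = random.gauss(rho * x, sd)   # draw y from p(y | x)
            if i >= burn_in:                # discard the burn-in period
                chain.append((x, y))
        return chain

    draws = gibbs_bivariate_normal(10_000, rho=0.8)
    mean_x = sum(x for x, _ in draws) / len(draws)
    mean_xy = sum(x * y for x, y in draws) / len(draws)
    print(f"sample mean of x = {mean_x:.3f} (expect 0), sample E[xy] = {mean_xy:.3f} (expect 0.8)")

Each update conditions on the current value of the other variable, so successive draws are correlated; thinning the chain or simply running it longer are the usual remedies when nearly independent samples are needed.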