What does "Markov chain" mean? Dictionary definition and online translation:
English definition
Noun:
- a Markov process for which the parameter is discrete time values
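To make the definition concrete, the sketch below simulates a small discrete-time Markov chain; the two states and the transition probabilities are invented for this illustration, not taken from the entry.

```python
import random

# Transition matrix for a hypothetical two-state chain:
# P[i][j] = probability of moving from state i to state j
# in one discrete time step.
P = [[0.9, 0.1],
     [0.5, 0.5]]

def step(state, rng):
    """Advance the chain by one discrete time step from `state`."""
    return 0 if rng.random() < P[state][0] else 1

def simulate(n_steps, start=0, seed=0):
    """Return the fraction of time spent in each state over n_steps."""
    rng = random.Random(seed)
    counts = [0, 0]
    state = start
    for _ in range(n_steps):
        state = step(state, rng)
        counts[state] += 1
    return [c / n_steps for c in counts]

fractions = simulate(100_000)
# For this matrix, the occupancy fractions should approach the
# stationary distribution (5/6, 1/6), found by solving pi * P = pi.
```

The "discrete time values" of the definition are the loop iterations: the chain's next state depends only on its current state, not on the earlier history.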
Usage and examples of Markov chain:
Example sentences
Used as a noun (n.)
- To analyze those algorithms, this paper proposes a new method that models point multiplication algorithms (elliptic curve scalar multiplication) as a Markov chain.
- Based on a Markov chain, this paper investigates the time-variant system reliability of a brittle structure under multiple time-varying loads.