 
 


This book is a practical guide to neural networks: it explains how to choose an architecture for a task, how to prepare data, and how to build and train models with modern frameworks. Each topic is accompanied by examples and working code. "Neural Networks. Practice" is addressed to readers who want to move from theory to working models.





 

 





Chapter 1: Fundamentals of Neural Networks



1.1. Choosing a network architecture for the task

The choice of neural network architecture depends primarily on the task being solved and the type of data available. Before building a model it is important to understand what has to be predicted, what the inputs look like, and what constraints apply.

Image classification tasks:

When the input is an image and the goal is to assign it to one of several classes, convolutional neural networks (Convolutional Neural Networks, CNNs) are the standard choice. Their convolutional layers exploit the spatial structure of images, learning local patterns such as edges and textures and combining them into progressively higher-level features.

Sequence and text processing tasks:

For data with a sequential structure, such as text, speech, or time series, recurrent neural networks (Recurrent Neural Networks, RNNs) and Transformers are used. They model dependencies between the elements of a sequence, which makes them suitable for tasks such as machine translation, sentiment analysis, and forecasting.

Regression tasks:

When the goal is to predict a continuous numeric value (for example, a price or a temperature), fully connected networks are often sufficient. The architecture is then chosen based on the number and nature of the input features rather than on spatial or sequential structure.

Unsupervised learning tasks:

When labeled data is unavailable, the network must discover structure in the data on its own. Architectures such as autoencoders learn compact representations of the input and are used for clustering, anomaly detection, and dimensionality reduction.

Generative tasks:

When the goal is to generate new data similar to the training data, specialized generative architectures are used. The right choice depends on the kind of data to be generated and on the quality requirements.

In practice the best architecture is rarely obvious in advance. A common approach is to start from a proven architecture for a similar task and then adjust the number and types of layers and the hyperparameters based on experiments and validation results.


1.2. Core components of a neural network: layers, activation functions, optimizers, loss functions

A neural network is assembled from a small set of basic components, and understanding each of them is necessary both for designing models and for debugging them. The main components are:

Layers:

Layers are the building blocks of a network. Each layer receives input data, transforms it, and passes the result on to the next layer. Different layer types implement different transformations; the most common are:

Fully connected layers (Fully Connected Layers):

In a fully connected (Dense) layer, every neuron is connected to every neuron of the previous layer. Such a layer performs a learned linear transformation of its input followed by a nonlinearity.

Fully connected layers are used both as intermediate layers and as output layers. They work well when the input features have no special spatial or sequential structure, for example in tabular data.

The operation of a single neuron in a fully connected layer consists of two steps: a weighted sum and an activation.

1. Weighted sum:

Each neuron computes a weighted sum of its inputs: every input value is multiplied by the corresponding weight, the products are summed, and a bias term is added. The weights are the parameters the network adjusts during training.

Formally, the weighted sum of a neuron is computed as:

z = w1*x1 + w2*x2 + … + wn*xn + b

where:

z is the weighted sum (the neuron's pre-activation value).

x1, x2, …, xn are the input values.

w1, w2, …, wn are the weights associated with the inputs.

b is the bias (bias), which shifts the activation threshold of the neuron.
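As a sanity check, the weighted sum can be computed directly in a few lines of Python (the inputs, weights, and bias below are made-up numbers chosen for illustration):

```python
# Hypothetical inputs, weights, and bias for a single neuron.
x = [1.0, 2.0, 3.0]
w = [0.5, -0.25, 0.1]
b = 0.2

# Weighted sum: z = w1*x1 + w2*x2 + ... + wn*xn + b
z = sum(wi * xi for wi, xi in zip(w, x)) + b
print(round(z, 6))  # 0.5
```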

2. Activation:

The weighted sum z is then passed through an activation function, which introduces nonlinearity. Without it, any stack of layers would collapse into a single linear transformation.

For example, the popular activation function ReLU outputs z itself when z is positive and zero otherwise.

Thus a neuron first aggregates its inputs linearly and then applies a nonlinearity. Stacking many such neurons and layers lets the network approximate complex functions.

Fully connected layers are flexible but parameter-hungry: the number of weights grows with the product of the layer sizes. This is why specialized layers, such as convolutional and recurrent layers, are preferred for images, text, and other structured data.

Convolutional layers (Convolutional Layers):

Convolutional layers are the core of convolutional neural networks (Convolutional Neural Networks, CNN) and the main tool for processing images and other data with spatial structure. They allow the network to learn local patterns regardless of where in the input those patterns appear.

A convolutional layer slides a small filter (kernel) over the input and computes a dot product between the filter weights and each local patch of the input. The result is a feature map that indicates where in the input the pattern encoded by the filter occurs.

Because the same filter is applied at every position, a convolutional layer has far fewer parameters than a fully connected layer of comparable input size. Early layers typically learn simple patterns such as edges and corners, while deeper layers combine them into textures, object parts, and whole objects.

The operation of a convolutional layer is determined by several components and hyperparameters:

1. Input data: usually a multi-dimensional array, for example an image with height, width, and color channels.

2. Filters (kernels): small weight matrices that are slid over the input. For example, a 3x3 filter covers a 3x3 patch of the input at each position.

3. Stride (Stride): the step with which the filter moves across the input. A larger stride produces a smaller output feature map.

4. Padding (Padding): extra values (usually zeros) added around the border of the input. Padding controls the spatial size of the output and lets the filter process border pixels.

5. Activation function: applied elementwise to the feature map. Common choices are ReLU (Rectified Linear Unit), sigmoid, and tanh.

6. Pooling (Pooling): a downsampling operation that reduces the spatial size of feature maps while keeping the strongest responses. The most common variants are max pooling and average pooling.

Convolutional layers are used in image classification, object detection, segmentation, and many other vision tasks. Thanks to weight sharing and local connectivity they scale to large images where fully connected layers would be impractical.
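To make the sliding-filter idea concrete, here is a minimal "valid" convolution written from scratch in NumPy. The image and filter values are made-up, and real frameworks provide heavily optimized versions of this operation; this sketch only shows the mechanics:

```python
import numpy as np

def conv2d_valid(image, kernel, stride=1):
    """'Valid' 2D cross-correlation: slide the kernel over the image
    with the given stride and no padding."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    oh = (ih - kh) // stride + 1
    ow = (iw - kw) // stride + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            patch = image[i*stride:i*stride+kh, j*stride:j*stride+kw]
            out[i, j] = np.sum(patch * kernel)
    return out

# A toy 4x4 "image" and a 3x3 vertical-edge-style filter (made-up numbers).
img = np.arange(16, dtype=float).reshape(4, 4)
kernel = np.array([[1., 0., -1.],
                   [1., 0., -1.],
                   [1., 0., -1.]])
fmap = conv2d_valid(img, kernel)
print(fmap.shape)  # (2, 2)
```

Note how a 4x4 input and a 3x3 filter with stride 1 and no padding yield a 2x2 feature map.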

Recurrent layers (Recurrent Layers):

Recurrent layers are designed for sequential data, where the order of elements matters and each element may depend on the previous ones, as in text, speech, and time series.

Unlike a feedforward layer, a recurrent layer processes a sequence step by step and maintains a hidden state (hidden state) that summarizes everything it has seen so far. At each step the layer combines the current input with the previous hidden state to produce a new hidden state.

The operation of a recurrent layer can be described by two repeating steps:

1. Hidden state update (Hidden State Update):

At each time step the layer computes a new hidden state from the current input element and the previous hidden state, typically by a learned linear transformation followed by a nonlinearity.

2. Hidden state passing (Hidden State Passing):

The new hidden state is passed on to the next time step, so information can propagate along the sequence.

Recurrent layers are used in machine translation, speech recognition, language modeling, and time-series forecasting. Their key property is the ability to model dependencies between elements that are far apart in the sequence.

However, simple recurrent layers (SimpleRNN) suffer from the vanishing gradient problem (vanishing gradient problem) and the exploding gradient problem (exploding gradient problem), which make it hard to learn long-range dependencies. More advanced recurrent architectures, such as LSTM (Long Short-Term Memory) and GRU (Gated Recurrent Unit), were designed to mitigate these problems.

Recurrent layers remain a standard tool for sequence modeling, although for many tasks, especially in natural language processing, they are increasingly displaced by attention-based architectures.
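The two steps above can be sketched in a few lines of NumPy. This is a simplified Elman-style recurrence with made-up dimensions and random weights, not a framework implementation:

```python
import numpy as np

def rnn_step(x_t, h_prev, W_xh, W_hh, b_h):
    """One step of a simple recurrent layer:
    h_t = tanh(W_xh @ x_t + W_hh @ h_prev + b_h)."""
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)

rng = np.random.default_rng(0)
input_dim, hidden_dim = 3, 4
W_xh = rng.normal(size=(hidden_dim, input_dim)) * 0.1
W_hh = rng.normal(size=(hidden_dim, hidden_dim)) * 0.1
b_h = np.zeros(hidden_dim)

# Process a sequence of 5 made-up input vectors,
# carrying the hidden state from step to step.
h = np.zeros(hidden_dim)
for x_t in rng.normal(size=(5, input_dim)):
    h = rnn_step(x_t, h, W_xh, W_hh, b_h)
print(h.shape)  # (4,)
```

The same hidden-state vector `h` is updated at every step; that is the "memory" of the layer.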

Activation functions:

Activation functions introduce nonlinearity into the network; without them a network of any depth would be equivalent to a single linear layer. The most widely used activation functions are:

Sigmoid function (Sigmoid):

The sigmoid function maps any real number into the interval from 0 to 1; its graph has a characteristic S-shape. It is defined by the formula:

σ(x) = 1 / (1 + exp(-x))

where x is the input value and exp is the exponential function.

Because its outputs lie in the interval (0, 1), the sigmoid is a natural choice for the output layer of a binary classifier, where the output is interpreted as the probability of the positive class.

However, the sigmoid has a serious drawback: for large positive or large negative inputs it saturates, its gradient becomes close to zero, and learning slows down. This is the vanishing gradient problem (vanishing gradient problem), and it becomes more severe as networks get deeper.

For this reason the sigmoid is rarely used in the hidden layers of modern deep networks; ReLU (Rectified Linear Unit) and its variants are used instead. ReLU does not saturate for positive inputs and is cheaper to compute.

In summary, the sigmoid remains useful in output layers whenever a value in (0, 1) is required, but for hidden layers other activations are usually a better choice.
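The saturation effect is easy to verify numerically: the derivative of the sigmoid, sigmoid(x) * (1 - sigmoid(x)), peaks at 0.25 at x = 0 and collapses toward zero for large |x|. A minimal sketch:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_grad(x):
    # Derivative of the sigmoid: s * (1 - s).
    s = sigmoid(x)
    return s * (1.0 - s)

print(round(sigmoid_grad(0.0), 4))  # 0.25 (the maximum)
print(sigmoid_grad(10.0))           # tiny: the saturated region
```

Multiplying several such tiny gradients together during backpropagation is exactly what makes deep sigmoid networks hard to train.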

Hyperbolic tangent (Tanh):

The hyperbolic tangent is similar to the sigmoid but maps its input into the interval from -1 to 1. It is defined by the formula:

tanh(x) = (exp(x) - exp(-x)) / (exp(x) + exp(-x))

where x is the input value and exp is the exponential function.

Like the sigmoid, tanh is a smooth S-shaped function, but its output range is (-1, 1).

Because its outputs are centered around zero, tanh often makes optimization easier than the sigmoid: zero-centered activations lead to better-behaved gradients in the following layers.

However, tanh shares the sigmoid's main weakness. First, it saturates for large |x|, so it also suffers from vanishing gradients. Second, it is more expensive to compute than simple piecewise-linear activations.

Tanh is still commonly used inside recurrent architectures, where a bounded, zero-centered activation helps keep the hidden state stable.

For the hidden layers of deep feedforward networks, though, ReLU and its variants are usually preferred; tanh is chosen when the bounded output range matters more than training speed.

ReLU function (Rectified Linear Unit):

ReLU (Rectified Linear Unit) is currently the most widely used activation function in deep learning. It outputs 0 for negative inputs and passes positive inputs through unchanged. The ReLU formula is:

ReLU(x) = max(0, x)

where x is the input value.

ReLU is extremely cheap to compute and does not saturate for positive inputs, so gradients flow well through deep networks; in practice this significantly speeds up training.

It also produces sparse activations: many neurons output exactly zero, which can make the learned representation more compact and efficient.

Thanks to these properties, ReLU is the default choice for the hidden layers of convolutional neural networks (Convolutional Neural Networks) and is widely used in fully connected networks (Fully Connected Neural Networks); in recurrent networks (Recurrent Neural Networks) bounded activations such as tanh are often preferred.

ReLU has drawbacks too, which motivated variants such as Leaky ReLU, Parametric ReLU, and Exponential ReLU. These variants allow a small nonzero output for negative inputs, which mitigates the "dead neurons" (dead neurons) problem, where a neuron gets stuck outputting zero and stops learning.

Linear function (Linear):

A linear activation simply returns its input unchanged. It is used in the output layers of regression models, where the network must be able to produce any real value.
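The difference between ReLU and its "leaky" variant mentioned above can be shown in a few lines of NumPy (alpha = 0.01 is a typical but arbitrary choice):

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # For negative inputs, pass a small fraction alpha*x instead of 0,
    # so the gradient never becomes exactly zero ("dead neuron" fix).
    return np.where(x > 0, x, alpha * x)

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(relu(x))        # [0.  0.  0.  1.5]
print(leaky_relu(x))  # [-0.02  -0.005  0.  1.5]
```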

Optimizers:

Optimizers determine how the network's weights are updated during training. After the loss is computed, the optimizer uses its gradients with respect to the weights to adjust them in the direction that reduces the loss.

The choice of optimizer affects both how fast training converges and what quality the final model reaches; different optimizers behave differently on different tasks.

The most basic optimizer is stochastic gradient descent (Stochastic Gradient Descent, SGD). At every step it moves the weights in the direction opposite to the gradient of the loss. In practice SGD is applied to small random subsets of the data (mini-batches) rather than to the whole dataset at once. The size of each step is controlled by a key hyperparameter, the learning rate (learning rate).

Popular adaptive optimizers include Adam (Adaptive Moment Estimation) and RMSprop (Root Mean Square Propagation). Adam maintains per-parameter adaptive learning rates, combining the ideas of momentum and RMSprop, and is a robust default for many problems. RMSprop scales each parameter's learning rate by a running average of the squared gradients, which helps with noisy or non-stationary objectives.

Many other optimizers exist as well, including Adagrad, Adadelta, Adamax, and Nadam. Each makes different trade-offs between convergence speed, memory usage, and sensitivity to hyperparameters.

There is no single best optimizer for all tasks; the choice depends on the model, the data, and the computational budget, and in practice it is often found by experimentation.
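The core idea shared by all of these optimizers, stepping against the gradient, can be shown on a one-dimensional toy function (both the function and the learning rate here are made-up for illustration):

```python
# Gradient descent on f(w) = (w - 3)^2, whose minimum is at w = 3.
# The gradient is f'(w) = 2 * (w - 3).
w = 0.0
learning_rate = 0.1
for _ in range(100):
    grad = 2.0 * (w - 3.0)
    w -= learning_rate * grad  # step against the gradient
print(round(w, 4))  # 3.0
```

Each step shrinks the distance to the minimum by a constant factor; adaptive optimizers like Adam differ mainly in how they choose the effective step size per parameter.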

Loss functions:

The loss function (also called the cost or objective function) measures how far the model's predictions are from the true targets. Training a neural network means minimizing this function with respect to the weights.

The loss function must match the task; using the wrong loss is one of the most common modeling mistakes. The most frequently used loss functions are:

1. Mean squared error (Mean Squared Error, MSE): the average of the squared differences between predictions and targets. It is the standard loss for regression tasks.

2. Cross-entropy loss (Cross-Entropy Loss): measures the difference between the predicted probability distribution and the true distribution. It is the standard loss for classification tasks.

3. Binary cross-entropy (Binary Cross-Entropy): the special case of cross-entropy for two classes, used together with a sigmoid output.

4. Categorical cross-entropy (Categorical Cross-Entropy): the multi-class case of cross-entropy, used together with a softmax output over the classes.

Specialized tasks use specialized losses. For example, image segmentation models often use Dice Loss, and generative adversarial networks (GAN) use an adversarial loss.

The choice of loss function directly shapes what the model optimizes, so it should reflect the metric you actually care about as closely as possible.

Together, layers, activation functions, optimizers, and loss functions form the core toolbox from which practical neural networks are assembled.
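The first two losses are easy to compute by hand; a minimal sketch with made-up predictions and targets:

```python
import math

def mse(y_true, y_pred):
    # Mean squared error: average of squared differences.
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def binary_cross_entropy(y_true, y_pred):
    # Average negative log-likelihood for labels in {0, 1},
    # with predictions interpreted as probabilities of class 1.
    return -sum(t * math.log(p) + (1 - t) * math.log(1 - p)
                for t, p in zip(y_true, y_pred)) / len(y_true)

print(mse([1.0, 2.0, 3.0], [1.0, 2.5, 2.0]))              # (0 + 0.25 + 1) / 3
print(binary_cross_entropy([1, 0], [0.9, 0.2]))           # small: both confident and correct
```

Note how binary cross-entropy rewards confident correct probabilities and would blow up for a confident wrong one (e.g. predicting 0.01 for a true label of 1).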


1.3. Overview of popular deep learning frameworks such as TensorFlow and PyTorch

Deep learning frameworks such as TensorFlow and PyTorch provide ready-made building blocks for defining, training, and deploying neural networks, so you do not have to implement layers, optimizers, and automatic differentiation from scratch. Let us look at both briefly.

1. TensorFlow:

TensorFlow is an open-source framework developed by Google and widely used both in research and in production. Key features of TensorFlow:

Computation graphs: TensorFlow describes models as graphs of operations on tensors, which allows the framework to optimize computations and execute them on different devices.

Scalability: TensorFlow can train and run models on CPUs, GPUs, and TPUs, and can scale training across multiple machines.

Ecosystem: TensorFlow ships with a rich ecosystem, including the high-level Keras API, TensorBoard for visualization, and tools for deploying models on servers and mobile devices.

Multiple language bindings: TensorFlow provides APIs for several languages, including Python, C++, Java, and others.

2. PyTorch:

PyTorch is an open-source framework developed by Facebook that is especially popular in the research community. Key features of PyTorch:

Dynamic computation graphs: unlike classic TensorFlow, PyTorch builds the computation graph on the fly as the code executes, which makes models easy to write and debug with ordinary Python tools.

Pythonic API: PyTorch feels like regular Python and NumPy, with a clear imperative API that lowers the entry barrier.

Flexibility: PyTorch makes it straightforward to implement custom layers, loss functions, and training loops, which is one reason it is favored for research and experimentation.

GPU support: PyTorch has first-class support for graphics processors (GPU), so moving computations to an accelerator usually requires only minimal code changes.

Both frameworks, TensorFlow and PyTorch, are mature and capable, and the choice between them is largely a matter of ecosystem and personal preference. Skills learned in one transfer readily to the other.




Chapter 2: Data Preparation



2.1. Collecting, cleaning, and preparing data for model training

Data preparation is one of the most important stages of any machine learning project and often takes most of the total time. The main steps are:

1. Data collection:

The first step is to gather the data the model will learn from. Data can come from many sources: databases, CSV files, web pages, sensors, public datasets, and so on. The format and quality of the raw data largely determine how much cleaning will be needed later.

For example, data stored in a relational database can be extracted with SQL queries, filtered, and joined before being exported for training.

CSV files and other tabular formats are conveniently loaded with the pandas library in Python, which reads them into a DataFrame, a table-like structure that is easy to inspect and transform.

For image data, libraries such as OpenCV and PIL are used to read images from disk and convert them into numeric arrays. At this stage it is also worth checking that the images have consistent sizes and formats.

Many services expose their data through an API (Application Programming Interface), a defined way for programs to request data from a service. APIs are a common way to collect data from web platforms and online services.

Some providers also ship an SDK (Software Development Kit) that wraps their API. An SDK contains libraries and utilities that make calling the API from your language of choice easier.

Before using an API, read its documentation and terms of service carefully. Many APIs require registration, enforce rate limits, and restrict how the collected data may be used.

A typical workflow for collecting data through an API looks like this:

Registration: create an account with the service and obtain the credentials (such as an API key) needed to call the API.

Reading the documentation: study the API reference to learn the available endpoints, the request format, the HTTP methods, the URL and body parameters, and the authentication scheme.

Writing the client code: use an HTTP library or the provider's SDK to send requests to the API and parse the responses, typically JSON or XML.

Handling limits and errors: respect the service's rate limits and handle errors and retries in your code.

Collecting data through an API lets you build up-to-date, well-structured datasets, but it requires attention to authentication, quotas, and the legal terms of data use.

2. Data cleaning:

Raw data is rarely ready for training. It typically contains errors, duplicates, outliers, and missing values, and these problems must be removed or corrected so that they do not distort what the model learns.

The main data-cleaning operations are:

Removing duplicates: repeated records add no information and can bias the model toward over-represented examples. Duplicates often appear when data is collected from several sources or scraped repeatedly. They should be identified and removed, keeping one copy of each record.

Handling outliers: outliers are values that differ drastically from the rest of the data. Some are genuine rare events, others are measurement or entry errors. Each outlier should be examined; erroneous ones are removed or corrected, while genuine ones may be kept, depending on the task.

Fixing inconsistencies: the same entity may be recorded in different ways (for example, different spellings, units, or date formats). Such inconsistencies must be brought to a single consistent representation before training.

Careful cleaning at this stage pays off later: models trained on clean data are more accurate, and their errors are easier to interpret.

3. Data transformation:

After cleaning, the data usually has to be transformed into a form the model can consume. Typical transformations include:

Type conversion: values stored as text are converted to the appropriate numeric or date types so that the model and the preprocessing tools can work with them. For example, a column of prices read as strings must be parsed into numbers.

Handling skewed distributions: strongly skewed features can dominate training. Transformations such as taking the logarithm compress large values and often make the distribution closer to normal, which helps many models.

Aggregation: raw records are sometimes too fine-grained, and features must be aggregated, for example by summing transactions per customer or averaging measurements per day.

Scaling: features measured on very different scales are brought to a common range. For example, values can be rescaled to the interval from 0 to 1 so that no single feature dominates simply because of its units.

Encoding non-numeric data: values that are not numeric must be converted into numbers, since neural networks operate on numeric tensors. Categorical and text features are handled with dedicated encoding techniques.

Which transformations are needed depends on the data and the model; the guiding principle is to present each feature in a form where its information is accessible to the network.

4. Encoding categorical variables:

Neural networks work with numbers, so categorical features must first be converted into a numeric representation. The main techniques are:

Categorical features: features whose values are categories, such as color (red, green, blue), city, or product type, cannot be fed into a network directly; they must be encoded as numbers. The standard approach for nominal categories is "one-hot encoding" (one-hot encoding).

The "one-hot encoding" method represents each category as a binary vector whose length equals the number of distinct categories. Exactly one position, the one corresponding to the category, is set to 1, and all others are 0. For example, for a "color" feature with the values (red, green, blue) the encoding is:

red: [1, 0, 0]

green: [0, 1, 0]

blue: [0, 0, 1]

This way each category gets its own binary column, and the vectors contain only 0s and 1s. Importantly, the encoding imposes no artificial order on the categories, which matters because nominal categories have none.

A drawback of "one-hot encoding" is that the number of columns grows with the number of categories, so for high-cardinality features the data can become very wide and sparse.

Despite this, "one-hot encoding" remains the default way to feed nominal categorical features into neural networks and other models.

One-hot encoding is easy to perform with the pandas library in Python.

```python
import pandas as pd

# An example DataFrame with one categorical column
# (the category values here are illustrative)
data = pd.DataFrame({'color': ['red', 'green', 'blue', 'green', 'red']})

# Apply one-hot encoding with get_dummies()
# (dtype=int gives 0/1 columns instead of booleans)
encoded_data = pd.get_dummies(data['color'], dtype=int)

# Concatenate the original column with its one-hot encoding
final_data = pd.concat([data, encoded_data], axis=1)

# Show the result
print(final_data)
```

Output:

```
   color  blue  green  red
0    red     0      0    1
1  green     0      1    0
2   blue     1      0    0
3  green     0      1    0
4    red     0      0    1
```

As the output shows, the "color" column has been expanded into one binary column per category by one-hot encoding. A 1 marks the category of each row, and 0s fill the remaining columns.

Scaling numeric features:

Numeric features often live on very different scales, which can slow down or destabilize training. Two standard remedies are standardization and normalization.

Standardization (Standardization):

Standardization rescales a feature to have mean 0 and standard deviation 1. It is appropriate when the feature is roughly normally distributed or when outliers should not be squeezed into a fixed range. A value x is standardized as:

x_standardized = (x - mean) / std

where mean is the mean of the feature and std is its standard deviation.

Normalization (Normalization):

Normalization (min-max scaling) rescales a feature into the range from 0 to 1. It is useful when the features must lie in a fixed interval. A value x is normalized as:

x_normalized = (x - min) / (max - min)

where min is the minimum value of the feature and max is its maximum.
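Both formulas can be checked on a small made-up sample in plain Python:

```python
import statistics

values = [2.0, 4.0, 6.0, 8.0]

# Min-max normalization into [0, 1]
lo, hi = min(values), max(values)
normalized = [(v - lo) / (hi - lo) for v in values]
print(normalized)  # [0.0, 0.333..., 0.666..., 1.0]

# Standardization to mean 0, std 1 (population standard deviation)
mean = sum(values) / len(values)
std = statistics.pstdev(values)
standardized = [(v - mean) / std for v in values]
print([round(v, 4) for v in standardized])
```

After standardization the values are symmetric around zero; after normalization the smallest value maps exactly to 0 and the largest to 1.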

In Python, libraries such as scikit-learn provide these transformations out of the box. Standardization with scikit-learn looks like this:

```python
from sklearn.preprocessing import StandardScaler

# Create a StandardScaler instance
scaler = StandardScaler()

# Fit the scaler to the data and transform it
# (data is a 2-D array of numeric features)
scaled_data = scaler.fit_transform(data)
```

scikit-learn likewise provides MinMaxScaler for normalization, with the same fit/transform interface. An important practical rule: the scaler's statistics must be computed on the training data only and then applied unchanged to the validation and test data, otherwise information leaks from the evaluation sets into training. Scaling is a small step, but forgetting it is a common cause of slow or unstable convergence.

Vectorizing text data:

Text also has to be converted to numbers before a network can process it. The naive approach of assigning each word an arbitrary integer loses all information about meaning.

A better approach is word embeddings (word embeddings). Word embeddings map each word to a dense vector of real numbers in such a way that words with similar meanings receive similar vectors.

A classic method for learning embeddings is Word2Vec, which trains on a large text corpus and learns vectors from the contexts in which words occur. Words that appear in similar contexts end up close to each other in the vector space.

Embeddings capture semantic relationships: synonyms and related words cluster together, so a model can generalize from a word it has seen to its neighbors in the embedding space.

Vectorization thus turns raw text into numeric input while preserving as much meaning as possible, which is essential for any model that consumes language data.

5. Splitting the data into training, validation, and test sets:

Once the data is prepared, it must be split into subsets with distinct roles. The standard subsets are:

Training set (Training Set):

The portion of the data the model learns from.

The network's weights are fitted to these examples.

Both the "inputs" and the "correct answers" are shown to the model, and it learns the mapping between them.

The training set is usually the largest of the three subsets.

Validation set (Validation Set):

Data used to evaluate the model during training and to tune it.

The model does not train on these examples, so validation performance shows how well it generalizes to data it has not "memorized".

The validation set guides decisions such as when to stop training and which hyperparameters to choose.

Because those decisions are made using validation performance, the validation set indirectly influences the model, which is why a separate test set is still needed.

Test set (Test Set):

Data used for the final evaluation after training and tuning are complete.

The test set must not be used for any training or tuning decisions; otherwise its estimate of real-world performance is biased.

Test performance is the closest available estimate of how the model will behave on genuinely new data.

For that reason the test set is touched only once, at the very end of the project.

A proper split guards against overfitting going unnoticed: a model can fit the training data perfectly and still fail on new data, and only held-out sets reveal this.

There are several standard ways to perform the split:

Random splitting:

The simplest approach is to shuffle the data and split it randomly, for example 70% for training, 15% for validation, and 15% for testing.

Ready-made utilities for this exist in libraries such as scikit-learn (Python) and caret (R).

Cross-validation (Cross-validation):

The data is divided into several folds (say, 5 or 10); the model is trained on all folds but one and evaluated on the held-out fold, rotating through all the folds.

The scores from all folds are averaged, giving a more stable estimate of model quality.

Cross-validation uses the data more efficiently than a single split.

It is especially valuable when the dataset is small and a single validation split would be noisy.

Temporal splitting:

For time-ordered data (for example, time series), the split must respect chronology: earlier data for training, later data for validation and testing.

A random split would let the model "see the future" during training, producing overly optimistic evaluation results that do not transfer to real use.

In general, the split should mimic how the model will be used in practice (training on the past and predicting the future, or training on one population and predicting on another), so that evaluation reflects reality. The proportions can vary, but the principle of strict separation always holds.

Libraries such as scikit-learn in Python provide convenient functions both for random splits and for cross-validation, so the splits rarely need to be coded by hand.

Below are examples of the three approaches.

1. Random splitting:

```python
from sklearn.model_selection import train_test_split

# Load the data (load_data() is a placeholder for your own loading code)
X, y = load_data()

# First split off 30%, then divide it half-and-half into validation and test
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)
X_val, X_test, y_val, y_test = train_test_split(X_test, y_test, test_size=0.5, random_state=42)

# Check the sizes of the subsets
print("Training set size:", X_train.shape)
print("Validation set size:", X_val.shape)
print("Test set size:", X_test.shape)
```

This example splits the data into training (70%), validation (15%), and test (15%) sets. The `train_test_split` function from scikit-learn shuffles the data and performs the split. The `test_size` parameter sets the fraction to split off, and `random_state` fixes the random seed so the split is reproducible.

2. Cross-validation (Cross-validation):

```python
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LinearRegression

# Load the data (load_data() is a placeholder for your own loading code)
X, y = load_data()

# Create the model
model = LinearRegression()

# Evaluate it with cross-validation
scores = cross_val_score(model, X, y, cv=5)  # 5 folds

# Report the results
print("Score on each fold:", scores)
print("Mean score:", scores.mean())
```

Here the data is split into 5 folds; the model is trained on four folds and evaluated on the remaining one, rotating through all the folds. The `cross_val_score` function from scikit-learn automates this and returns one score per fold. Averaging the scores gives a more reliable estimate of model quality than a single split would.

3. Temporal splitting:

```python
# Load time-ordered data (load_temporal_data() is a placeholder)
X, y = load_temporal_data()

# Compute the sizes of the training and validation portions
train_size = int(0.7 * len(X))
val_size = int(0.15 * len(X))

# Earlier data for training, later data for validation and testing
X_train, y_train = X[:train_size], y[:train_size]
X_val, y_val = X[train_size:train_size+val_size], y[train_size:train_size+val_size]
X_test, y_test = X[train_size+val_size:], y[train_size+val_size:]

# Check the sizes of the subsets
print("Training set size:", X_train.shape)
print("Validation set size:", X_val.shape)
print("Test set size:", X_test.shape)
```

Here the data is split chronologically into training (70%), validation (15%), and test (the remaining 15%) sets. No shuffling is performed, so the model is always evaluated on data that comes after the data it was trained on. This mirrors real forecasting, where the model must predict the future from the past, and prevents information from the future leaking into training.

Whichever method is used, the key rule is the same: the test set remains untouched until the final evaluation, and every tuning decision is based only on the training and validation sets.

6. Handling missing values:

Real datasets almost always contain missing values; they arise from collection errors, unanswered questions, sensor failures, and merged sources. Missing values must be dealt with, because most models cannot consume them directly. The main strategies are:

Filling with the mean: missing values in a numeric column are replaced by the column's mean. This is simple and works reasonably well when the data has no strong outliers.

```python
import pandas as pd

# Load the data
data = pd.read_csv('data.csv')

# Fill missing values with the column means
# (numeric_only=True restricts the means to numeric columns)
data_filled = data.fillna(data.mean(numeric_only=True))
```

Filling with the median: missing values are replaced by the column's median instead. The median is robust to outliers, so it is preferred when the column contains extreme values.

```python
import pandas as pd

# Load the data
data = pd.read_csv('data.csv')

# Fill missing values with the column medians
data_filled = data.fillna(data.median(numeric_only=True))
```

Filling with the mode: for categorical columns the most frequent value (the mode) is used, since means and medians are undefined for categories.

```python
import pandas as pd

# Load the data
data = pd.read_csv('data.csv')

# Fill missing values with each column's most frequent value
data_filled = data.fillna(data.mode().iloc[0])
```

More sophisticated options also exist, such as predicting missing values from the other columns. The right choice depends on how much data is missing and why; when a value's absence is itself informative, it can additionally be encoded as a separate indicator feature.

7. Feature engineering:

Features (features) are the input variables the model learns from, and their quality often matters more than the choice of model. Feature engineering is the process of creating, transforming, and selecting features so that they expose the information relevant to the task.

Good features make the model's job easier: the more directly a feature expresses the signal, the less the network has to discover on its own.

Feature engineering includes creating new features from raw data, using transformations, aggregations, and domain knowledge; for example, deriving the day of week and hour from a raw timestamp.

It also includes transforming existing features: scaling numeric ones and encoding categorical ones, for example with "one-hot encoding", as described earlier in this section.

Finally, it includes feature selection: removing irrelevant or redundant features, which reduces noise, speeds up training, and often improves generalization.

For text, feature engineering overlaps with vectorization: word embeddings such as Word2Vec and GloVe convert words into dense vectors that serve as features, capturing meaning rather than just identity.

For images, convolutional networks (CNN) learn features automatically from raw pixels, largely replacing hand-crafted image features; nevertheless, preprocessing choices still shape what the network can learn.

Feature engineering draws on domain knowledge, exploratory analysis, statistics, and experimentation; it is as much craft as science.

Investing time in features usually pays off more than squeezing the last fraction of a percent out of the model architecture. Well-engineered features can let a simple model outperform a complex one trained on raw data.

What does the process look like?

Feature engineering typically proceeds through the following steps:

1. Understanding the task: clarify what is being predicted and which factors could plausibly influence it. This framing guides all subsequent feature work.

2. Exploring the data: inspect distributions, correlations, and missing values. Visualization often reveals which raw variables carry signal and which need transformation. This step grounds feature ideas in evidence.

3. Creating features: derive new variables from the raw ones, using transformations, aggregations, and domain knowledge. The goal is to express the signal as directly as possible.

4. Transformation: put the features into a model-friendly form by scaling numeric features, encoding categorical ones, and handling skewed distributions.

5. Feature selection: keep the features that help and drop the rest, using methods such as Recursive Feature Elimination, feature importance scores (Feature Importance), statistical tests, or regularization methods such as Lasso and Ridge regression. Fewer, better features mean faster training and less overfitting.

6. Evaluation and iteration: train the model with the current feature set, measure validation performance, and go back to create or remove features. Feature engineering is an iterative loop, not a single pass.

As a concrete illustration, consider predicting whether a product review is positive:

1. Task: predict the review's sentiment from its text and metadata.

2. Exploring the data: look at review lengths, common words, and how ratings are distributed. Reviews that mention certain aspects, such as quality, may correlate with sentiment.

3. Creating features: derive features such as the review length, punctuation counts, the star rating, and indicators of specific topics. Each feature should capture something plausibly related to sentiment.

4. An indicator feature: create a binary feature "mentions_quality" that equals 1 if the review text mentions quality and 0 otherwise. Such indicators translate domain intuition directly into model input.

5. Evaluation: train the model with and without "mentions_quality" and compare validation scores to see whether the feature helps.

6. Iteration: if "mentions_quality" improves the results, keep it and look for similar indicators; if not, drop it and test other ideas.

In this way, "mentions_quality" shows how domain knowledge becomes a concrete numeric input that the model can exploit.

The example is simple, but the pattern generalizes: hypothesize a signal, encode it as a feature, and verify its value empirically.

Feature engineering is often where the biggest quality gains in a project come from, which justifies the effort it takes.

To summarize, the main data-preparation techniques covered in this section are:

1. Word embeddings (Word Embeddings):

Represent words as dense numeric vectors.

Capture semantic similarity between words.

The standard representation for natural language processing (Natural Language Processing, NLP).

2. One-Hot Encoding:

Represents each category as a binary vector.

Imposes no artificial order on the categories.

Can produce very wide data for high-cardinality features.

3. Scaling (Scaling):

Brings numeric features to comparable ranges.

Standardization rescales to mean 0 and standard deviation 1.

Normalization rescales to the range from 0 to 1.

Stabilizes and speeds up training.

4. Handling missing values:

Fill with the mean, median, or mode depending on the data.

Robust statistics (the median) are preferred when outliers are present.

The chosen strategy should reflect why the values are missing.

5. Data cleaning:

Remove duplicates, fix errors, and resolve inconsistencies.

Clean data is a precondition for reliable models.

6. Data splitting:

Separate training, validation, and test sets.

The test set is used only for the final evaluation.

Temporal data must be split chronologically.

7. Feature engineering:

Create and select features that expose the signal.

Combines domain knowledge, statistics, and experimentation.

Often contributes more to final quality than the choice of model architecture.

Together these techniques turn raw data into a dataset a neural network can actually learn from, and the time spent on them is rarely wasted.


2.2. Data preprocessing methods such as tokenization, normalization, and image scaling

Preprocessing methods such as tokenization, normalization, and image scaling convert raw inputs into the numeric form a network expects. The appropriate methods depend on the type of data.

1. Text data:

Text examples: product reviews, news articles, social media posts.

Text must be broken into units and converted to numbers before a model can process it. The first step is usually tokenization. The main text preprocessing steps are:

Tokenization: splitting the text into tokens, usually words or subword units. Each sentence is split into its words, which then become the units of all further processing.

Tokenization may also keep punctuation as separate tokens. For example, tokenizing "How are you?" can yield ["How", "are", "you", "?"].

Stop-word removal: stop-words are extremely common words, such as articles, prepositions, and conjunctions, that carry little meaning on their own. Removing stop-words reduces noise and shrinks the vocabulary.

Stemming and lemmatization: reducing words to a common base form. For example, "runs", "running", and "ran" can all be mapped to "run". This groups word forms together so the model treats them as one unit.

Which steps to apply depends on the task: for sentiment analysis stop-words such as negations may actually matter, while for topic classification aggressive normalization usually helps.
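The first steps of this pipeline, lowercasing, tokenization, and stop-word removal, can be sketched in a few lines of Python (the stop-word list here is a tiny illustrative sample, not a standard resource):

```python
import re

# A made-up mini stop-word list for illustration only.
STOP_WORDS = {"the", "a", "is", "are", "and", "of"}

def preprocess(text):
    # Lowercase, split into word tokens, and drop stop-words.
    tokens = re.findall(r"[a-z']+", text.lower())
    return [t for t in tokens if t not in STOP_WORDS]

print(preprocess("The cat and the dog are friends."))
# ['cat', 'dog', 'friends']
```

Real pipelines use curated stop-word lists and proper tokenizers, but the structure is the same.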

Word embeddings (word embeddings):

After tokenization, the tokens must be turned into numbers. Word embeddings do this by mapping each word to a dense vector of real numbers, learned so that similar words receive similar vectors.

The geometry of the embedding space reflects meaning: words that occur in similar contexts end up close together. This lets a model generalize from words it has seen to related words it has not.

Two classic algorithms for learning word embeddings are Word2Vec and GloVe.

Word2Vec: Word2Vec learns vectors by training a shallow network to predict words from their contexts, or contexts from words. Words that appear in similar contexts receive similar vectors. Word2Vec has two variants: Continuous Bag of Words (CBOW) and Skip-gram.

GloVe: GloVe (Global Vectors for Word Representation) learns vectors from the global word co-occurrence statistics of the corpus. It factorizes the co-occurrence matrix so that relations between vectors capture relations between words. Like Word2Vec, GloVe places semantically related words close together in the vector space.

Pretrained vectors from Word2Vec or GloVe can be plugged into a model directly, which is especially useful when the training corpus is small; alternatively, embeddings can be trained from scratch on the task's own data.

As an example, suppose we want to learn embeddings from two toy sentences with Word2Vec. The steps are as follows.

Sentences:

1. "I love to eat pizza."

2. "I love to read books."

Learning Word2Vec embeddings from these sentences involves:

Tokenization: split each sentence into words.

Result:

Sentence 1: ["i", "love", "to", "eat", "pizza"]

Sentence 2: ["i", "love", "to", "read", "books"]

Training Word2Vec: use the Gensim library to train a Word2Vec model on the tokenized sentences, with vector size 100 and window size 5.

Code in Python:

```python
from gensim.models import Word2Vec

sentences = [["i", "love", "to", "eat", "pizza"],
             ["i", "love", "to", "read", "books"]]

# In Gensim 4.x the vector-size parameter is called vector_size
# (older versions called it size); min_count=1 keeps the rare words
# of this tiny corpus in the vocabulary.
model = Word2Vec(sentences, vector_size=100, window=5, min_count=1)
```

Extracting the vectors: after training, the vector of any word in the vocabulary can be looked up.

Code in Python:

```python
vector_pizza = model.wv["pizza"]
vector_books = model.wv["books"]

print("Vector for 'pizza':")
print(vector_pizza)
print("\nVector for 'books':")
print(vector_books)
```

Output:

```
Vector for 'pizza':
[0.12345678, -0.23456789, …] (100 numbers in total)

Vector for 'books':
[0.98765432, -0.87654321, …] (100 numbers in total)
```

       ""  "",    .                    .

Recurrent (RNN) and convolutional (CNN) networks for text: both recurrent neural networks (RNN) and convolutional neural networks (CNN) can consume sequences of word vectors, but they model text differently.

Recurrent neural networks (RNN):

Idea: an RNN reads the sequence word by word, carrying a hidden state that acts as a "memory" of everything read so far. This makes it naturally suited to order-dependent phenomena in language.

Use in text processing: RNNs are applied to language modeling, machine translation, sentiment analysis, and other tasks where meaning depends on word order and longer-range context.

Convolutional neural networks (CNN):

Idea: a CNN slides filters over the sequence of word vectors, detecting local patterns, in effect informative n-grams, regardless of where in the text they occur.

Use in text processing: CNNs work well for text classification tasks, where the presence of characteristic phrases matters more than their exact position or long-range dependencies.

The choice between them depends on the task: RNNs handle long-range dependencies better, while CNNs are faster and capture local patterns well. Hybrid architectures combining RNN and CNN components also exist.

2. Images:

Image examples: photographs, medical scans, satellite imagery.

Images must be brought to a consistent size and value range before being fed into a network. The main preprocessing operations are:

Scaling (Scaling): changing the image size while preserving its content. Networks typically expect a fixed input size, so all images are scaled to it.

Cropping (Cropping): cutting out a region of the image. Cropping can remove irrelevant borders or focus the input on the object of interest; for example, a central crop keeps the middle of the image.

Resizing (Resizing): changing an image to exact target dimensions, possibly altering its aspect ratio. It ensures all inputs share the same shape, which the network requires.

Normalization (Normalization): rescaling pixel values to a standard range. For example, 8-bit pixel values from 0 to 255 are commonly mapped into the range 0 to 1 or -1 to 1, which stabilizes training.

Examples of each operation:

1. Scaling (Scaling):

In Python, scaling can be done with the PIL library (Python Imaging Library):

```python
from PIL import Image

def scale_image(image, new_size):
    # Resize the image to the given (width, height)
    resized_image = image.resize(new_size)
    return resized_image

image = Image.open('image.jpg')
scaled_image = scale_image(image, (224, 224))
scaled_image.show()
```

This example defines a `scale_image` function that resizes an image to the requested size using PIL's `resize` method. The image is loaded with `Image.open`, scaled to 224x224 pixels with `scale_image`, and displayed with `show`.

2. Cropping (Cropping):

In Python, cropping can also be done with PIL:

```python
from PIL import Image

def crop_image(image, new_size):
    # Compute the box of a centered crop of the given size
    width, height = image.size
    left = (width - new_size[0]) // 2
    top = (height - new_size[1]) // 2
    right = left + new_size[0]
    bottom = top + new_size[1]
    cropped_image = image.crop((left, top, right, bottom))
    return cropped_image

image = Image.open('image.jpg')
cropped_image = crop_image(image, (200, 200))
cropped_image.show()
```

This example defines a `crop_image` function that cuts a centered region of the requested size out of the image. The crop box is computed so that it is centered and then passed to PIL's `crop` method. The image is loaded with `Image.open`, cropped to 200x200 pixels, and displayed with `show`.

3. Resizing (Resizing):

In Python, resizing to exact target dimensions is done with PIL as well:

```python
from PIL import Image

def resize_image(image, new_size):
    # Resize to the exact target dimensions (aspect ratio may change)
    resized_image = image.resize(new_size)
    return resized_image

image = Image.open('image.jpg')
resized_image = resize_image(image, (500, 500))
resized_image.show()
```

This example defines a `resize_image` function that changes the image to the exact target dimensions with PIL's `resize` method, regardless of the original aspect ratio. The image is loaded with `Image.open`, resized to 500x500 pixels, and displayed with `show`.

4. Normalization (Normalization):

In Python, normalization can be done with the NumPy library:

```python
import numpy as np
from PIL import Image

def normalize_image(image):
    # Min-max normalization of pixel values into the range [0, 1]
    image = image.astype(np.float64)
    normalized_image = (image - np.min(image)) / (np.max(image) - np.min(image))
    return normalized_image

image = np.array(Image.open('image.jpg'))
normalized_image = normalize_image(image)
```

This example defines a `normalize_image` function that rescales pixel values into the range from 0 to 1: the minimum pixel value maps to 0 and the maximum to 1. The image is loaded with `Image.open`, converted into a NumPy array with `np.array`, and passed to `normalize_image`.

These operations are typically combined into a preprocessing pipeline that every image passes through before training or inference, so that the network always receives inputs of the same size and value range.

Convolutional neural networks (CNN): after preprocessing, images are usually fed into a convolutional network, the standard architecture for visual data.

Convolutional neural networks (Convolutional Neural Networks, CNN) are designed to process images and exploit their spatial structure. They are used for image classification, object detection, segmentation, and many other vision tasks. A typical CNN consists of the following components:

1. Convolutional layers (Convolutional Layers): the core of a CNN. Filters (kernels) slide over the input with a certain stride (stride), and each filter produces a feature map (feature map) showing where its pattern occurs. Early layers detect simple features such as edges; deeper layers detect increasingly complex ones.

2. Pooling layers (Pooling Layers): reduce the spatial dimensions of the feature maps, most commonly by max pooling (Max Pooling) or average pooling (Average Pooling). This reduces computation and makes the representation more robust to small shifts in the input.

3. Fully connected layers (Fully Connected Layers): placed after the convolutional and pooling stages, they combine the extracted features to produce the final output, for example class scores.

4. Activation functions (Activation Functions): introduce nonlinearity after each layer. Inside a CNN the usual choice is ReLU (Rectified Linear Unit); the output layer often uses softmax to turn the scores into class probabilities.

During training, the network's weights (the filters and dense-layer weights) are adjusted by backpropagation (Backpropagation) to minimize the loss. A trained CNN maps a raw image to a prediction end to end, learning its own features along the way.
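When stacking such layers it is useful to track the spatial size of the feature maps. For input size W, filter size F, padding P, and stride S, the output size is (W - F + 2P) / S + 1. A small sketch with made-up layer settings:

```python
def conv_output_size(w, f, p, s):
    # Output spatial size of a convolution or pooling layer:
    # (W - F + 2P) / S + 1, using integer division.
    return (w - f + 2 * p) // s + 1

# A made-up stack: 64x64 input -> 3x3 conv (no padding) -> 2x2 max-pool.
size = 64
size = conv_output_size(size, f=3, p=0, s=1)  # after conv: 62
size = conv_output_size(size, f=2, p=0, s=2)  # after pool: 31
print(size)  # 31
```

Running this check before building a model catches shape mismatches early, for example a pooling stack that shrinks the feature maps to zero.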

Layer examples in TensorFlow:

1. A convolutional layer (Convolutional Layer) example:

```python
import tensorflow as tf

# Create a convolutional layer with 32 filters of size 3x3
conv_layer = tf.keras.layers.Conv2D(filters=32, kernel_size=(3, 3), activation='relu', input_shape=(64, 64, 3))

# Apply the layer to input data
output = conv_layer(input_data)
```

Explanation: this creates a convolutional layer with 32 filters of size 3x3 and a ReLU activation. The layer expects 3-channel inputs of size 64x64. Applying the layer to `input_data` produces the feature maps in `output`.

2. A pooling layer (Pooling Layer) example:

```python
import tensorflow as tf

# Create a max-pooling layer with a 2x2 window
pooling_layer = tf.keras.layers.MaxPooling2D(pool_size=(2, 2))

# Apply the layer to input data
output = pooling_layer(input_data)
```

Explanation: this creates a max-pooling layer with a 2x2 window. The layer keeps the maximum value in each 2x2 region, halving the spatial dimensions of the feature maps. Applying it to `input_data` produces the downsampled result in `output`.

3. A fully connected layer (Fully Connected Layer) example:

```python
import tensorflow as tf

# Create a fully connected (dense) layer with 256 neurons
dense_layer = tf.keras.layers.Dense(units=256, activation='relu')

# Apply the layer to input data
output = dense_layer(input_data)
```

Explanation: this creates a dense layer with 256 neurons and a ReLU activation. Each neuron is connected to every input value. Applying the layer to `input_data` produces the layer's activations in `output`.

4. Activation function (Activation Function) examples:

```python
import tensorflow as tf

# Apply the ReLU activation function
output = tf.keras.activations.relu(input_data)

# Apply the softmax activation function
output = tf.keras.activations.softmax(input_data)
```

Explanation: activation functions can also be applied directly. `tf.keras.activations.relu` replaces the negative values in `input_data` with zeros and keeps the positive ones. `tf.keras.activations.softmax` converts a vector of scores into a probability distribution whose components are positive and sum to 1.

These building blocks from TensorFlow can be combined into complete convolutional architectures; the same components exist in PyTorch under slightly different names.




End of the introductory fragment.

The full text of the book is available for purchase.

You can read the full version of the book at litres (https://www.litres.ru/book/dzheyd-karter/neyroseti-praktika-69414148/chitat-onlayn/).



