Train the network using cross-entropy loss

See https://pytorch.org/docs/stable/generated/torch.nn.CrossEntropyLoss.html for details. It expects an input of size Nsamples x Nclasses containing the un-normalized logits, and a target, which is y_train_l of size Nsamples x 1. You may also feed it y_train_T of size Nsamples x Nclasses; please see the documentation.
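
As a quick, self-contained illustration (made-up tensors, not the assignment's data), CrossEntropyLoss accepts either class-index targets or per-class probability targets; the latter form requires PyTorch 1.10 or newer:

import torch

logits = torch.randn(4, 3)                 # raw, unnormalized scores: 4 samples x 3 classes
loss_fn = torch.nn.CrossEntropyLoss()

target_idx = torch.tensor([0, 2, 1, 2])    # class indices, shape (Nsamples,)
print(loss_fn(logits, target_idx))

# same targets expressed as per-class probabilities (one-hot), shape (Nsamples, Nclasses)
target_prob = torch.nn.functional.one_hot(target_idx, num_classes=3).float()
print(loss_fn(logits, target_prob))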

Cross-entropy loss expects raw, unnormalized scores. Softmax converts those raw scores into probabilities, which are used here only to plot the predicted labels.
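
For example (a minimal sketch with arbitrary numbers):

import torch

raw = torch.tensor([[2.0, 0.5, -1.0]])   # unnormalized scores (logits)
sm = torch.nn.Softmax(dim=1)
print(sm(raw))                           # probabilities summing to 1, roughly [0.79, 0.18, 0.04]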

Use SGD with a learning rate of 1e-2 and run 20,000 epochs to train the neural network:

net = NeuralNet(Nfeatures,Nclasses,20,20).to(device)
sm = torch.nn.Softmax(dim=1)

# Weight the cross entropy loss to balance the classes
Nsamples_per_class = y_train_T.sum(axis=0)
Weight = Nsamples_per_class.sum()/Nsamples_per_class
loss = torch.nn.CrossEntropyLoss(weight=Weight)
learning_rate = 0.01
#YOUR CODE HERE
optimizer = optim.SGD(net.parameters(), lr=learning_rate)  # define the optimizer

for epoch in range(20000):
    
    #YOUR CODE HERE
    predNN = ??   # Forward pass
    error = ??    # find the loss
    optimizer.zero_grad()              # clear the gradients
    error.backward()              # Send the loss backward
    optimizer.step()              # update weights 

    if epoch % 5000 == 0:
      print("Error =",error.detach().cpu().item())
      fig,ax = plt.subplots(1,2,figsize=(12,4))
      ax[0].plot(y_train_T[0:40].detach().cpu())
      ax[1].plot(sm(predNN[0:40]).detach().cpu())
      plt.show()
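
One possible way to fill in the placeholders is sketched below. It is not the official solution: X_train_T is an assumed name for the training-feature tensor, and whether y_train_l needs the squeeze() depends on how it was constructed (class-index targets must be a 1-D tensor of integer labels).

for epoch in range(20000):

    predNN = net(X_train_T)                    # forward pass: raw logits, Nsamples x Nclasses
    error = loss(predNN, y_train_l.squeeze())  # weighted cross-entropy loss
    optimizer.zero_grad()                      # clear the gradients
    error.backward()                           # send the loss backward
    optimizer.step()                           # update weights

    if epoch % 5000 == 0:
        print("Error =", error.detach().cpu().item())

Alternatively, error = loss(predNN, y_train_T.float()) uses the per-class (one-hot) targets directly, which CrossEntropyLoss supports in PyTorch 1.10 and later.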
