
Conversation

@SourceryAI

Thanks for starring sourcery-ai/sourcery ✨ 🌟 ✨

Here's your pull request refactoring your most popular Python repo.

If you want Sourcery to refactor all your Python repos and incoming pull requests, install our bot.

Review changes via command line

To manually merge these changes, make sure you're on the master branch, then run:

git fetch https://github.com/sourcery-ai-bot/xeno master
git merge --ff-only FETCH_HEAD
git reset HEAD^


  def derivative(self, input=None):
-     last_forward = input if input else self.last_forward
+     last_forward = input or self.last_forward

Function ReLU.derivative refactored with the following changes:
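One caveat worth keeping in mind with this family of rewrites: `a or b` and `a if a else b` are equivalent, but both select on truthiness rather than on `a is None`, so a falsy-but-present argument (an empty list, `0`) silently falls back to the cached value, and a multi-element NumPy array raises `ValueError` under either form. A hypothetical sketch of the difference from an explicit `None` check (`pick`/`pick_none_safe` are illustrative names, not xeno's API):

```python
def pick(input=None, last_forward="cached"):
    # Truthiness-based selection, as in the refactored code: falls back
    # whenever `input` is falsy, not only when it is None.
    return input or last_forward


def pick_none_safe(input=None, last_forward="cached"):
    # Explicit None check: an empty container still counts as provided input.
    return input if input is not None else last_forward


pick([])            # 'cached' -- [] is falsy, so it falls back
pick_none_safe([])  # [] -- the empty list is kept
```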


  def derivative(self, input=None):
-     last_forward = input if input else self.last_forward
+     last_forward = input or self.last_forward

Function Linear.derivative refactored with the following changes:

  exp_x = np.exp(x)
- s = exp_x / np.sum(exp_x, axis=1, keepdims=True)
- return s
+ return exp_x / np.sum(exp_x, axis=1, keepdims=True)

Function Softmax.forward refactored with the following changes:
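Dropping the single-use `s` makes the return direct. Separately (not part of this PR), softmax implementations commonly subtract the per-row max before exponentiating so that large logits do not overflow `np.exp`; a sketch of that variant:

```python
import numpy as np


def softmax(x):
    # Shift each row by its max before exponentiating. The result is
    # unchanged because softmax is invariant to adding a constant per row,
    # but exp() no longer overflows for large logits.
    shifted = x - np.max(x, axis=1, keepdims=True)
    exp_x = np.exp(shifted)
    return exp_x / np.sum(exp_x, axis=1, keepdims=True)


probs = softmax(np.array([[1.0, 2.0, 3.0], [1000.0, 1000.0, 1000.0]]))
# Each row sums to 1; the second row would be NaN with the unshifted form.
```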

Comment on lines -100 to +99

- last_forward = input if input else self.last_forward
+ last_forward = input or self.last_forward

Function Softmax.derivative refactored with the following changes:

  fan_out = size[1]

- elif len(size) == 4 or len(size) == 5:
+ elif len(size) in [4, 5]:

Function decompose_size refactored with the following changes:
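The membership test reads more cleanly than the chained `==` comparisons. For context, a `decompose_size` helper in this style of library typically derives fan-in/fan-out for weight initialization, with the 4-D/5-D branches covering 2-D and 3-D convolution kernels; a hedged sketch under that assumption (the shape layout and branch details are guesses, not xeno's exact code):

```python
import numpy as np


def decompose_size(size):
    # Assumed layouts: 2-D weights are (fan_in, fan_out); 4-D/5-D conv
    # kernels are (out_channels, in_channels, *spatial). The conv branch
    # uses the refactored membership test `len(size) in [4, 5]`.
    if len(size) == 2:
        fan_in, fan_out = size[0], size[1]
    elif len(size) in [4, 5]:
        receptive_field = np.prod(size[2:])
        fan_in = size[1] * receptive_field
        fan_out = size[0] * receptive_field
    else:
        fan_in = fan_out = int(np.sqrt(np.prod(size)))
    return fan_in, fan_out


decompose_size((8, 3, 3, 3))  # (27, 72): 3x3 kernel, 3 in, 8 out channels
```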

- while iter_idx < max_iter:
-     iter_idx += 1
+ for iter_idx in range(1, max_iter + 1):

Function Model.fit refactored with the following changes:
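Replacing the manually advanced counter with `range` drops two lines and removes a class of off-by-one bugs (a forgotten or doubled increment). A minimal check that the two forms visit the same indices, since the original incremented at the top of the loop body:

```python
max_iter = 3

# Original form: manual counter, incremented before the body runs,
# so the body sees 1..max_iter.
seen_while = []
iter_idx = 0
while iter_idx < max_iter:
    iter_idx += 1
    seen_while.append(iter_idx)

# Refactored form: range drives the counter over the same values.
seen_for = list(range(1, max_iter + 1))

seen_while == seen_for  # True: both are [1, 2, 3]
```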

Comment on lines -145 to +142

- y_pred = x_next
- return y_pred
+ return x_next

Function Model.predict refactored with the following changes:

  linear_out = np.dot(input, self.W) + self.b
- act_out = self.act_layer.forward(linear_out)
- return act_out
+ return self.act_layer.forward(linear_out)

Function Dense.forward refactored with the following changes:
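Inlining the `act_out` temporary leaves the affine-then-activate shape of a dense layer visible in a single return. A minimal self-contained sketch of that forward pass (assuming `W` is shaped `(in_features, out_features)` and using an identity activation; xeno's real class also stores state for backprop):

```python
import numpy as np


class Dense:
    def __init__(self, W, b, act=lambda z: z):
        self.W, self.b, self.act = W, b, act

    def forward(self, input):
        # Affine transform, then activation, returned directly --
        # no single-use intermediate name.
        linear_out = np.dot(input, self.W) + self.b
        return self.act(linear_out)


layer = Dense(np.eye(2), np.ones(2))
out = layer.forward(np.array([[1.0, 2.0]]))  # [[2.0, 3.0]]
```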

Comment on lines -131 to +133

- if train:
-     self.last_mask = get_rng().binomial(1, 1 - self.p, input.shape) / (1 - self.p)
-     return input * self.last_mask
- else:
-     return input * (1 - self.p)
+ if not train:
+     return input * (1 - self.p)
+ self.last_mask = get_rng().binomial(1, 1 - self.p, input.shape) / (1 - self.p)
+ return input * self.last_mask

Function Dropout.forward refactored with the following changes:
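Inverting the condition turns the inference path into a guard clause, so the training logic is no longer nested under an `else`. A self-contained sketch with the same branch shape, substituting `numpy.random.default_rng` for xeno's `get_rng` and a plain function for the layer class:

```python
import numpy as np


def dropout_forward(input, p, train, rng=None):
    if rng is None:
        rng = np.random.default_rng(0)
    # Guard clause: inference path first, mirroring the refactored code.
    if not train:
        return input * (1 - p)
    # Training path: keep each unit with probability 1-p and scale
    # survivors by 1/(1-p) so the expected activation is unchanged
    # ("inverted" dropout).
    mask = rng.binomial(1, 1 - p, input.shape) / (1 - p)
    return input * mask
```

As an aside, since the training mask here already rescales by `1/(1-p)`, a fully inverted-dropout implementation would usually return the input unchanged at inference time; the `(1 - p)` scaling shown matches the branches visible in the diff.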

- dx = dx1 + dx2
-
- return dx
+ return dx1 + dx2

Function BatchNormal.backward refactored with the following changes:

This removes the following comments (why?):

# step0 done!
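`BatchNormal.backward` sums two gradient contributions before returning, because the input feeds two paths (the centred values and the batch mean). A simplified stand-in showing that summing pattern on just the mean-subtraction step `y = x - mean(x)` (not xeno's full batch-norm backward):

```python
import numpy as np


def mean_subtract_backward(dy, x):
    # Gradient of y = x - mean(x) with respect to x, given upstream dy.
    # Path 1: dy flows straight through the identity term x.
    dx1 = dy
    # Path 2: dy flows through mean(x), which spreads -1/n to every x_i.
    dx2 = -np.full_like(x, np.sum(dy) / x.size)
    # Both paths touch x, so the contributions are summed and returned
    # directly, as in the refactored `return dx1 + dx2`.
    return dx1 + dx2


x = np.array([1.0, 2.0, 4.0])
dy = np.array([1.0, 0.0, 0.0])
mean_subtract_backward(dy, x)  # [2/3, -1/3, -1/3]
```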
