deep
@deephivex
2.4K posts

floating point account
Pallete Town · Joined February 2025
230 Following · 296 Followers

Pinned Tweet
deep@deephivex·
manifesting an internship by the end of 2nd year
deep@deephivex·
every time i come home I'm greeted with a few days of compulsory cold and fever 😷🏳️
ShaRPeyE@sharpeye_wnl·
trying to confirm if I am shadow banned or not, if you see this say hi
deep@deephivex·
some mutual is talking like ruben veidt nowadays
deep@deephivex·
@cneuralnetwork and bf was at iisc (best physicist? / next nobel prize winner type something)
neural nets.@cneuralnetwork·
idk if you remember but there used to be a girl who worked in adobe research & iisc & bigbasket in her first year in 2023/24 times, randomly came to mind
Jerkeyray@jerkeyray·
@deephivex same but that's exactly why it's effective 😭
Jerkeyray@jerkeyray·
interview strat: ask the dude to open his chatgpt history and go through how he uses ai for his work
knight@knightkun__·
@deephivex did this mutual post an article today
deep@deephivex·
@Zyara_1ot u poured yourself in the blog aah great one till date :^)
deep@deephivex·
so, in practice, CNNs often combine both invariance and equivariance by using convolution layers to maintain shift-equivariance and pooling or striding layers to approximate shift invariance. thanks if u read this far :)
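The claim in the tweet above is easy to check in a few lines of numpy (a hypothetical sketch, not code from the thread; `conv1d`, `x`, and `k` are illustrative names): convolving a shifted signal yields a shifted copy of the original feature map (equivariance), while a global max pool discards position and so is unchanged by the shift (invariance).

```python
# Hypothetical numpy sketch of the point above (names are illustrative):
# convolution commutes with shifts (equivariance); global pooling
# discards position entirely (invariance). Strided/windowed pooling
# would only give approximate invariance.
import numpy as np

def conv1d(x, k):
    # 'valid' cross-correlation, the basic CNN building block
    n = len(x) - len(k) + 1
    return np.array([np.dot(x[i:i + len(k)], k) for i in range(n)])

x = np.array([0., 0., 1., 3., 1., 0., 0., 0., 0., 0.])
k = np.array([1., 2., 1.])
shifted = np.roll(x, 2)  # shift the input right by 2 samples

# shift-equivariance: conv(shift(x)) == shift(conv(x))
# (exact here because the bump stays away from the boundaries)
print(np.allclose(conv1d(shifted, k), np.roll(conv1d(x, k), 2)))  # True

# shift-invariance via global max pooling: same descriptor either way
print(conv1d(x, k).max() == conv1d(shifted, k).max())  # True
```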
deep@deephivex·
to infer relationships b/w feature maps that differ only by rotation. researchers have explored methods to mitigate this, such as using spherical harmonics to describe features in a rotation-invariant way, which helps the network to maintain stability despite i/p rotations
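The spherical-harmonics idea in the tweet above has a simple 2-D analogue, circular harmonics, that fits in a short demo (a hypothetical sketch, not from the thread; `ring_descriptor` is a made-up name): rotating the input circularly shifts samples taken around a ring, a circular shift only changes the phase of the DFT coefficients, and so their magnitudes form a rotation-invariant descriptor.

```python
# Hypothetical 2-D analogue of the spherical-harmonic trick described
# above: sample a feature along a ring; a rotation becomes a circular
# shift of the samples, and the DFT-coefficient magnitudes
# (circular-harmonic power) are unchanged by any such shift.
import numpy as np

def ring_descriptor(samples):
    # |F_m| for each circular-harmonic order m: rotation-invariant
    return np.abs(np.fft.fft(samples))

rng = np.random.default_rng(0)
ring = rng.standard_normal(16)   # feature values sampled around a ring
rotated = np.roll(ring, 5)       # rotate by 5/16 of a full turn

print(np.allclose(ring_descriptor(ring), ring_descriptor(rotated)))  # True
```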
deep@deephivex·
the central question is why CNNs are so effective at processing challenging data. this brings us to invariance and equivariance, which are essential for explaining how CNNs handle transformations like shifts in input data.