[FEATURE] Fusing Leaky Relu operator in Fully Connected #19971
szha merged 1 commit into apache:v1.x
Conversation
Hey @agrabows, thanks for submitting the PR.
CI supported jobs: [clang, sanity, centos-cpu, centos-gpu, miscellaneous, edge, website, windows-cpu, windows-gpu, unix-gpu, unix-cpu]. Note:
Force-pushed from 4e8806d to dbd09ba
@mxnet-bot run ci [unix-gpu]
Jenkins CI successfully triggered: [unix-gpu]
if (param.act_type == leakyrelu::kLeakyReLU ||
    param.act_type == leakyrelu::kELU ||
    param.act_type == leakyrelu::kGELU) {
  matched_list_.push_back(&new_node);
  status_ = kSuccess;
  return true;
}
The full set of supported activation types is {'elu', 'gelu', 'leaky', 'prelu', 'rrelu', 'selu'}. What about the rest of the activation types? Will they be supported by oneDNN in the future?
Those are the activation types that can easily be fused with FC in this PR. We plan to enable the other activation fusions in a follow-up PR, as they require a different approach.
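For reference, here is a minimal sketch of how the act_type values matched by the pass above map onto the user-facing LeakyReLU modes. The symbols come from MXNet's Python API; which modes fuse is taken from the discussion above, and the graph itself is illustrative only.

import mxnet as mx

data = mx.sym.Variable('data')
fc = mx.sym.FullyConnected(data, num_hidden=64, name='fc')

# act_type values matched by the pass above (kLeakyReLU, kELU, kGELU):
fusable = [mx.sym.LeakyReLU(fc, act_type=t) for t in ('leaky', 'elu', 'gelu')]

# Remaining LeakyReLU modes, not fused in this PR; note that 'prelu'
# additionally takes a learned gamma input:
deferred = [mx.sym.LeakyReLU(fc, act_type=t) for t in ('rrelu', 'selu')]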
Description
This change enables fusing the Leaky ReLU operator and Fully Connected into a single operator supported by the oneDNN backend, for both int8 and fp32 data types.
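As a hedged illustration (not taken from this PR), the fp32 fusion path in MXNet v1.x is typically exercised through the subgraph API. The 'MKLDNN' backend name and the small graph below are assumptions made for the sketch:

import mxnet as mx

# Build a small FullyConnected -> LeakyReLU graph.
data = mx.sym.Variable('data')
fc = mx.sym.FullyConnected(data, num_hidden=64, name='fc')
act = mx.sym.LeakyReLU(fc, act_type='leaky', slope=0.1, name='lrelu')

# Ask the oneDNN (MKL-DNN) subgraph backend to rewrite supported patterns;
# with this change, the FC and LeakyReLU nodes should be replaced by a
# single fused operator in the returned symbol.
fused = act.get_backend_symbol('MKLDNN')
print(fused.tojson())

The int8 path would instead go through the usual quantization flow; this sketch only covers fp32.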
Checklist
Essentials
Changes