Lines Matching defs:gather

1508 /// Perform LICM and CSE on the newly generated gather sequences.
1523 /// Checks if the specified gather tree entry \p TE can be represented as a
1603 /// It may happen if all gather nodes are loads and they cannot be
1627 /// scatter or just simple gather.
1633 /// vectorization sequences rather than masked gather.
1634 /// \param TryRecursiveCheck used to check if long masked gather can be
1721 /// A load candidate for masked gather.
2982 /// \param ReorderableGathers List of all gather nodes that require reordering
2983 /// (e.g., gather of extractelements or partially vectorizable loads).
2984 /// \param GatherOps List of gather operand nodes for \p UserTE that require
2992 /// Checks if the given \p TE is a gather node with clustered reused scalars
2997 /// if any. If it is not vectorized (gather node), returns nullptr.
3018 /// if any. If it is not vectorized (gather node), returns nullptr.
3104 /// vector type and gather such instructions into a bunch, which highly likely
3113 /// vector type and gather such instructions into a bunch, which highly likely
3168 gather(ArrayRef<Value *> VL, Value *Root, Type *ScalarTy,
3288 /// Checks if the current node is a gather node.
3300 /// Do we need to gather this sequence or vectorize it
3301 /// (either with vector instruction or with scatter/gather
3305 ScatterVectorize, ///< Masked scatter/gather node.
3343 /// For gather/buildvector/alt opcode (TODO) nodes, which are combined from
3618 "Need to vectorize gather entry?");
3781 /// pre-gather them before.
3784 /// List of gather nodes, depending on other gather/vector nodes, which should
3794 /// strided or masked gather approach, but attempted to be represented as
4636 assert(TE.isGather() && "Expected gather node only.");
5163 // strided/masked gather loads. Returns true if vectorized + shuffles
5164 // representation is better than just gather.
5170 // Compare masked gather cost and loads + insert subvector costs.
5175 // Estimate the cost of masked gather GEP. If not a splat, roughly
5203 // The cost of masked gather.
5214 // compare masked gather cost and gather cost.
5229 // shuffles is better than just gather.
5251 // If the reorder is needed, consider it a high-cost masked gather for now.
5338 // If the masked gather cost is higher, it is better to vectorize, so
5339 // consider it as a gather node. It will be better estimated
5351 // GEPs or have > 2 operands, we end up with a gather node, which just
5364 // Check if potential masked gather can be represented as series
5366 // If the masked gather cost is higher, it is better to vectorize, so
5367 // consider it as a gather node. It will be better estimated
5485 assert(TE.isGather() && "Expected gather node only.");
5803 // TODO: add analysis of other gather nodes with extractelement
5812 // Check that gather of extractelements can be represented as
5820 // If the gather node is <undef, v, .., poison> and
5949 // ExtractElement gather nodes which can be vectorized and need to handle
7101 // Erase the last masked gather candidate if another candidate within
7313 // Try to build long masked gather loads.
7649 // Too many operands - gather, most probably won't be vectorized.
8247 // indices, otherwise we should gather them, not try to vectorize.
8510 // This is a special case, as it does not gather, but at the same time
8569 // Vectorizing non-consecutive loads with `llvm.masked.gather`.
8576 // Vectorizing non-consecutive loads with `llvm.masked.gather`.
8753 // Required to be able to find correct matches between different gather
8754 // nodes and reuse the vectorized values rather than trying to gather them
9551 // into masked gather load intrinsic.
9585 "Expected gather node without reordering.");
9793 // Try to reorder gather nodes for better vectorization opportunities.
9951 // No need to reorder masked gather loads, just reorder the scalar
10107 // Try to vectorize gathered loads if this is not just a gather of loads.
10737 // Check that gather of extractelements can be represented as just a
10870 Value *gather(ArrayRef<Value *> VL, unsigned MaskVF = 0,
11687 // If this node generates a masked gather load, then it is not a terminal node.
11969 // with the second gather nodes if they have fewer scalar operands than
11970 // the initial tree element (it may be profitable to shuffle the second gather)
12097 // Check if any of the gather nodes forms an insertelement buildvector
12466 // Some gather nodes might be absolutely the same as some vectorizable
12475 // Exclude the cost of gather load nodes which are not used. These nodes were
12478 "Expected gather nodes with users only.");
12939 /// vector type and gather such instructions into a bunch, which highly likely
13013 // Check that gather of extractelements can be represented as just a
13041 /// vector type and gather such instructions into a bunch, which highly likely
13101 // around). The other node is not limited to being of a gather kind. Gather
13116 // Check the order of the gather nodes' users.
13146 "Expected only single user of a gather node.");
13224 // fall back to the regular gather.
13248 // Try to find the perfect match in another gather node first.
13592 // No need to check for the topmost gather node.
13607 "Expected only single user of the gather node.");
13916 Value *BoUpSLP::gather(
14353 // Postpone gather emission; it will be emitted after the end of the
14490 Value *gather(ArrayRef<Value *> VL, unsigned MaskVF = 0,
14492 return R.gather(VL, Root, ScalarTy,
14703 // Need to update the operand gather node if the operand is actually not a
14704 // vectorized node but a buildvector/gather node, which matches one of
14714 assert(It != VectorizableTree.end() && "Expected gather node operand.");
14720 // Find the corresponding gather entry and vectorize it.
14729 "Expected only single user for the gather node.");
14737 assert(E->isGather() && "Expected gather node.");
14860 // Postpone gather emission; it will be emitted after the end of the
14899 // Postpone gather emission; it will be emitted after the end of the
14908 LLVM_DEBUG(dbgs() << "SLP: perfect diamond match for gather bundle "
15182 Value *BV = ShuffleBuilder.gather(GatheredScalars, BVMask.size());
15197 Vec = ShuffleBuilder.gather(NonConstants, Mask.size(), Vec);
15203 Value *BV = ShuffleBuilder.gather(GatheredScalars, ReuseMask.size());
15214 Value *BV = ShuffleBuilder.gather(GatheredScalars);
16223 // Also, gather up main and alt scalar ops to propagate IR flags to
16326 // Found gather node which is absolutely the same as one of the
16338 // This is because the source vector that was supposed to feed this gather node was
16381 // Scan through gather nodes.
16449 assert(!E->isGather() && "Extracting from a gather list");
16931 << " gather sequences instructions.\n");
17022 // Perform O(N^2) search over the gather/shuffle sequences and merge identical
17031 // For all instructions in blocks containing gather sequences:
18160 // Check if the root is trunc and the next node is gather/buildvector, then
18545 // is primarily intended to catch gather-like idioms ending at
18632 Size = 2; // cut off masked gather small trees
19672 // gather all the reduced values, sorting them by their value id.
19925 // List of the values that were reduced in other trees as part of gather
21873 // gather-like cases of the form:
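
The matches around file lines 8569, 8576, 9551, and 11687 concern vectorizing
non-consecutive loads with the llvm.masked.gather intrinsic. A minimal sketch of
what that emission boils down to through the public IRBuilder API is given
below; the helper name emitMaskedGather, the all-true mask, and the reliance on
opaque pointers (Builder.getPtrTy()) are assumptions for illustration, not the
pass's actual code path.

  #include "llvm/ADT/ArrayRef.h"
  #include "llvm/IR/Constants.h"
  #include "llvm/IR/DerivedTypes.h"
  #include "llvm/IR/IRBuilder.h"
  #include "llvm/IR/Instructions.h"

  using namespace llvm;

  // Hypothetical helper: combine N non-consecutive scalar loads of ScalarTy
  // into a single llvm.masked.gather over a vector of their pointers.
  static Value *emitMaskedGather(IRBuilder<> &Builder,
                                 ArrayRef<LoadInst *> Loads, Type *ScalarTy,
                                 Align Alignment) {
    unsigned NumElts = Loads.size();
    auto *VecTy = FixedVectorType::get(ScalarTy, NumElts);
    auto *PtrVecTy = FixedVectorType::get(Builder.getPtrTy(), NumElts);

    // Build the vector of (non-consecutive) pointers the scalar loads used.
    Value *Ptrs = PoisonValue::get(PtrVecTy);
    for (unsigned I = 0; I != NumElts; ++I)
      Ptrs = Builder.CreateInsertElement(Ptrs, Loads[I]->getPointerOperand(), I);

    // Every lane is loaded, so the mask is all-true and no pass-through
    // value is needed.
    Value *Mask = Constant::getAllOnesValue(
        FixedVectorType::get(Builder.getInt1Ty(), NumElts));
    return Builder.CreateMaskedGather(VecTy, Ptrs, Alignment, Mask);
  }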
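
When no profitable vector form is found for a node, the gather/buildvector path
(BoUpSLP::gather, matched around file lines 3168, 10870, and 13916) ultimately
materializes the scalars as an insertelement chain; the reuse-mask and shuffle
handling of the real code is omitted here. A stripped-down sketch of that idea,
reusing the includes from the previous sketch; the helper name
buildVectorFromScalars is an assumption:

  // Hypothetical helper: build a vector value out of the gathered scalars by
  // inserting them one by one into a poison vector.
  static Value *buildVectorFromScalars(IRBuilder<> &Builder,
                                       ArrayRef<Value *> VL, Type *ScalarTy) {
    auto *VecTy = FixedVectorType::get(ScalarTy, VL.size());
    Value *Vec = PoisonValue::get(VecTy);
    for (unsigned I = 0, E = VL.size(); I != E; ++I)
      Vec = Builder.CreateInsertElement(Vec, VL[I], I);
    return Vec;
  }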
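
The matches around file lines 5163-5367 belong to the cost model that decides
whether a masked gather is actually cheaper than the scalar loads it replaces
(plus building the vector, or loads plus shuffles). A rough sketch of that
comparison through the TargetTransformInfo interface follows; the helper name
and the simplifications (no GEP, shuffle, or reordering costs) are assumptions,
not the pass's real logic.

  #include "llvm/Analysis/TargetTransformInfo.h"
  #include "llvm/IR/DerivedTypes.h"
  #include "llvm/IR/Instruction.h"
  #include "llvm/Support/Alignment.h"

  using namespace llvm;

  // Hypothetical helper: compare the cost of one masked gather of VF elements
  // against VF individual scalar loads.
  static bool maskedGatherIsCheaper(const TargetTransformInfo &TTI,
                                    Type *ScalarTy, unsigned VF,
                                    Align Alignment, const Value *PtrOperand) {
    auto *VecTy = FixedVectorType::get(ScalarTy, VF);
    const auto CostKind = TargetTransformInfo::TCK_RecipThroughput;

    // Cost of a single masked gather load of the whole vector.
    InstructionCost GatherCost = TTI.getGatherScatterOpCost(
        Instruction::Load, VecTy, PtrOperand, /*VariableMask=*/false, Alignment,
        CostKind);

    // Cost of VF scalar loads; the real model also accounts for building the
    // vector (insertelements or shuffles) and the address computation.
    InstructionCost ScalarLoadsCost = 0;
    for (unsigned I = 0; I != VF; ++I)
      ScalarLoadsCost += TTI.getMemoryOpCost(Instruction::Load, ScalarTy,
                                             Alignment, /*AddressSpace=*/0,
                                             CostKind);
    return GatherCost < ScalarLoadsCost;
  }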