Unverified commit ab94feac, authored by Zhenxu Ke, committed by GitHub

Merge branch 'master' into issue/5860

...@@ -12,7 +12,7 @@ ___
___
### Bug
- Which version of SkyWalking, OS, and JRE?
- Which company or project?
......
<!--
⚠️ Please make sure to read this template first; pull requests that don't accord with this template
may be closed without notice.
Texts surrounded by `<` and `>` are meant to be replaced by you, e.g. <framework name>, <issue number>.
Put an `x` in the `[ ]` to mark the item as CHECKED. `[x]`
-->
...@@ -8,7 +8,7 @@
<!-- ==== 🐛 Remove this line WHEN AND ONLY WHEN you're fixing a bug, follow the checklist 👇 ====
### Fix <bug description or the bug issue number or bug issue link>
- [ ] Add a unit test to verify that the fix works.
- [ ] Explain briefly why the bug exists and how to fix it.
==== 🐛 Remove this line WHEN AND ONLY WHEN you're fixing a bug, follow the checklist 👆 ==== -->
<!-- ==== 🔌 Remove this line WHEN AND ONLY WHEN you're adding a new plugin, follow the checklist 👇 ====
......
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
name: E2E
on:
  pull_request:
  push:
    branches:
      - master
env:
  SKIP_TEST: true
jobs:
  Compatibility:
    name: Compatibility
    runs-on: ubuntu-latest
    timeout-minutes: 30
    env:
      SW_SIMPLE_CASE: compat
      SW_AGENT_JDK_VERSION: 11
      SW_OAP_BASE_IMAGE: adoptopenjdk/openjdk11:alpine
    steps:
      - uses: actions/checkout@v2
        with:
          submodules: true
      - name: Cache local Maven repository
        uses: actions/cache@v2
        with:
          path: ~/.m2/repository
          key: ${{ runner.os }}-maven-${{ hashFiles('**/pom.xml') }}
          restore-keys: |
            ${{ runner.os }}-maven-
      - name: Set Up Java
        uses: actions/setup-java@v1
        with:
          java-version: 11
      - name: Build Docker Image
        run: make docker
      - name: Copy dist package
        run: cp -R dist test/e2e/
      - name: Compatibility Test (8.3.0 Agent)
        run: ./mvnw --batch-mode -f test/e2e/pom.xml -am -DfailIfNoTests=false verify -Dit.test=org.apache.skywalking.e2e.simple.SimpleE2E
      - uses: actions/upload-artifact@v1
        if: failure()
        with:
          name: logs
          path: logs
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
name: E2E-NodeJS
on:
  pull_request:
    paths:
      - '**'
      - '!**.md'
  push:
    branches:
      - master
env:
  SKIP_TEST: true
  SW_AGENT_JDK_VERSION: 8
jobs:
  NodeJSAgent:
    name: NodeJSAgent
    runs-on: ubuntu-latest
    timeout-minutes: 30
    steps:
      - uses: actions/checkout@v2
        with:
          submodules: true
      - name: Cache local Maven repository
        uses: actions/cache@v2
        with:
          path: ~/.m2/repository
          key: ${{ runner.os }}-maven-${{ hashFiles('**/pom.xml') }}
          restore-keys: |
            ${{ runner.os }}-maven-
      - name: Compile and Build
        run: make docker
      - name: Copy dist package
        run: cp -R dist test/e2e/
      - name: NodeJS Agent
        run: ./mvnw --batch-mode -f test/e2e/pom.xml -am -DfailIfNoTests=false verify -Dit.test=org.apache.skywalking.e2e.NodeJSE2E
...@@ -7,12 +7,21 @@ Release Notes.
#### Project
* Incompatible with previous releases when using H2/MySQL/TiDB storage options, due to supporting multiple alarm rules triggered for one entity.
* Chore: adapt `create_source_release.sh` to make it runnable on Linux.
* Add `package` to `.proto` files to prevent polluting the top-level namespace in some languages; the OAP server supports previous agent releases, whereas previous OAP servers (<= 8.3.0) won't recognize newer agents since this version (>= 8.4.0).
#### Java Agent
* The operation name of the quartz-scheduler plugin has been changed to the `quartz-scheduler/${className}` format.
* Fix the jdk-http and okhttp-3.x plugins not overwriting the old trace header.
* Add interceptors for the `analyze`, `searchScroll`, `clearScroll`, `searchTemplate`, and `deleteByQuery` methods in the elasticsearch-6.x plugin.
* Support collecting logs of log4j, log4j2, and logback in the tracing context with a new `logger-plugin`.
* Fix the unexpected RunningContext recreation in the Tomcat plugin.
* Fix the potential NPE when `trace_sql_parameters` is enabled.
* Update `byte-buddy` to 1.10.19.
* Fix the thrift plugin's broken trace link when an intermediate service does not mount the agent.
* Fix the thrift plugin collecting wrong arguments when the method has no parameters.
* Fix DataCarrier's `org.apache.skywalking.apm.commons.datacarrier.buffer.Buffer` implementation not being activated in `IF_POSSIBLE` mode.
* Fix ArrayBlockingQueueBuffer's useless `IF_POSSIBLE` mode list.
* Support building a gRPC TLS channel without requiring a CA file.
#### OAP-Backend
* Make meter receiver support MAL.
...@@ -20,9 +29,15 @@ Release Notes.
* Support Kafka MirrorMaker 2.0 to replicate topics between Kafka clusters.
* Add the rule name field to the alarm record storage entity as a part of the ID, to support multiple alarm rules triggered for one entity. The scope id has been removed from the ID.
* Fix MAL concurrent execution issues.
* Fix group name not being queryable in GraphQL.
* Fix a potential gRPC connection leak (not closed) for the channels among OAP instances.
* Filter out OAP instances (unassigned in the booting stage) with an empty IP in KubernetesCoordinator.
* Add component ID for Python aiohttp plugin requester and server.
* Fix H2 in-memory database table missing issues.
* Add component ID for Python pyramid plugin server.
* Add component ID for NodeJS Axios plugin.
* Fix searchService method error in storage-influxdb-plugin.
* Add JavaScript component ID.
#### UI
* Fix un-removed tags in trace query.
...@@ -35,11 +50,16 @@ Release Notes.
* Refactor dashboard query in a common script.
* Implement refreshing data for topology by updating date.
* Implement group selector in the topology.
* Fix `all` as the default parameter for the services selector.
* Add icon for the Python aiohttp plugin.
* Add icon for the Python pyramid plugin.
* Fix topology rendering all service nodes when groups changed.
#### Documentation
* Update the documents of backend fetcher and self observability about the latest configurations.
* Add documents about the group name of service.
* Update docs about the latest UI.
* Update the document of backend trace sampling with the latest configuration.
All issues and pull requests are [here](https://github.com/apache/skywalking/milestone/68?closed=1)
......
...@@ -54,22 +54,22 @@ for better performance. Read [the paper of STAM](https://wu-sheng.github.io/STAM
NOTICE, SkyWalking 8.0+ uses [v3 protocols](docs/en/protocols/README.md). They are incompatible with previous releases.
# Downloads
Please head to the [releases page](https://skywalking.apache.org/downloads/) to download a release of Apache SkyWalking.
# Code of conduct
This project adheres to the Contributor Covenant [code of conduct](CODE_OF_CONDUCT.md). By participating, you are expected to uphold this code.
Please follow the [REPORTING GUIDELINES](CODE_OF_CONDUCT.md#reporting-guidelines) to report unacceptable behavior.
# Live Demo
Find the [demo](https://skywalking.apache.org/#demo) and [screenshots](https://skywalking.apache.org/#arch) on our website.
**Video on youtube.com**
[![RocketBot UI](http://img.youtube.com/vi/mfKaToAKl7k/0.jpg)](http://www.youtube.com/watch?v=mfKaToAKl7k)
# Compiling project
Follow this [document](docs/en/guides/How-to-build.md).
# Contact Us
* Mail list: **dev@skywalking.apache.org**. Mail to `dev-subscribe@skywalking.apache.org`, and follow the reply to subscribe to the mailing list.
* Join the `skywalking` channel at [Apache Slack](http://s.apache.org/slack-invite). If the link is not working, find the latest one at [Apache INFRA WIKI](https://cwiki.apache.org/confluence/display/INFRA/Slack+Guest+Invites).
...@@ -77,13 +77,10 @@ Follow this [document](docs/en/guides/How-to-build.md).
* QQ Group: 901167865 (Recommended), 392443393
* [bilibili videos](https://space.bilibili.com/390683219)
# Our Users
Hundreds of companies and organizations use SkyWalking for research, production, and commercial products.
The [PoweredBy](docs/powered-by.md) page includes more users of the project.
Users are encouraged to add themselves there.
# Landscapes
......
...@@ -44,10 +44,18 @@ public class DataCarrier<T> {
}
public DataCarrier(String name, String envPrefix, int channelSize, int bufferSize) {
this(name, envPrefix, channelSize, bufferSize, BufferStrategy.BLOCKING);
}
public DataCarrier(String name, String envPrefix, int channelSize, int bufferSize, BufferStrategy strategy) {
this.name = name;
bufferSize = EnvUtil.getInt(envPrefix + "_BUFFER_SIZE", bufferSize);
channelSize = EnvUtil.getInt(envPrefix + "_CHANNEL_SIZE", channelSize);
channels = new Channels<>(channelSize, bufferSize, new SimpleRollingPartitioner<T>(), strategy);
}
public DataCarrier(int channelSize, int bufferSize, BufferStrategy strategy) {
this("DEFAULT", "DEFAULT", channelSize, bufferSize, strategy);
}
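The constructor chain above now fixes the buffer strategy at construction time, delegating from the old signature with a `BLOCKING` default. A minimal, self-contained sketch of that delegation pattern, using simplified stand-in classes rather than the real SkyWalking `DataCarrier`/`Channels`/`EnvUtil` API:

```java
// Stand-in types for illustration only: the real DataCarrier also resolves
// channel/buffer sizes from environment variables and builds a Channels
// instance. Names below are simplified, not the actual SkyWalking API.
public class CarrierConstructorSketch {
    enum BufferStrategy { BLOCKING, IF_POSSIBLE }

    static class MiniCarrier {
        final BufferStrategy strategy;

        // Old signature keeps its old behavior by delegating with BLOCKING.
        MiniCarrier(int channelSize, int bufferSize) {
            this(channelSize, bufferSize, BufferStrategy.BLOCKING);
        }

        // New signature: the strategy is fixed for the carrier's lifetime,
        // replacing the removed runtime setter.
        MiniCarrier(int channelSize, int bufferSize, BufferStrategy strategy) {
            this.strategy = strategy;
        }
    }

    public static void main(String[] args) {
        MiniCarrier legacy = new MiniCarrier(5, 100);
        MiniCarrier dropping = new MiniCarrier(2, 100, BufferStrategy.IF_POSSIBLE);
        // prints "BLOCKING IF_POSSIBLE"
        System.out.println(legacy.strategy + " " + dropping.strategy);
    }
}
```

Making the strategy immutable per carrier removes the race the old runtime setter allowed, where a strategy change could land while channels were already consuming.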
/**
...@@ -62,14 +70,6 @@ public class DataCarrier<T> {
return this;
}
/**
* produce data to buffer, using the given {@link BufferStrategy}.
*
......
...@@ -40,16 +40,12 @@ public class ArrayBlockingQueueBuffer<T> implements QueueBuffer<T> {
@Override
public boolean save(T data) {
// only BufferStrategy.BLOCKING
try {
queue.put(data);
} catch (InterruptedException e) {
// Ignore the error
return false;
}
return true;
}
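The `save` above now implements only the `BLOCKING` path; the `IF_POSSIBLE` (drop-when-full) path lives in a dedicated buffer. The two strategies map directly onto the JDK's `ArrayBlockingQueue` operations, shown here with plain JDK classes and no SkyWalking types:

```java
import java.util.concurrent.ArrayBlockingQueue;

public class StrategyDemo {
    public static void main(String[] args) throws InterruptedException {
        ArrayBlockingQueue<String> queue = new ArrayBlockingQueue<>(2);

        // IF_POSSIBLE semantics: offer() returns false instead of blocking
        // when the buffer is full, so the producer drops data and moves on.
        boolean a = queue.offer("a");
        boolean b = queue.offer("b");
        boolean c = queue.offer("c"); // buffer full -> false, data dropped

        // BLOCKING semantics: put() waits for free space; here space exists
        // after a take(), so it returns immediately.
        queue.take();
        queue.put("d");

        // prints "true true false size=2"
        System.out.println(a + " " + b + " " + c + " size=" + queue.size());
    }
}
```

For trace segments the agent favors `IF_POSSIBLE`, since dropping a segment under load is preferable to blocking the instrumented application thread.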
......
...@@ -33,7 +33,7 @@ import org.powermock.api.support.membermodification.MemberModifier;
public class DataCarrierTest {
@Test
public void testCreateDataCarrier() throws IllegalAccessException {
DataCarrier<SampleData> carrier = new DataCarrier<>(5, 100, BufferStrategy.IF_POSSIBLE);
Channels<SampleData> channels = (Channels<SampleData>) (MemberModifier.field(DataCarrier.class, "channels")
.get(carrier));
...@@ -42,8 +42,7 @@ public class DataCarrierTest {
QueueBuffer<SampleData> buffer = channels.getBuffer(0);
Assert.assertEquals(100, buffer.getBufferSize());
Assert.assertEquals(MemberModifier.field(buffer.getClass(), "strategy").get(buffer), BufferStrategy.IF_POSSIBLE);
Assert.assertEquals(MemberModifier.field(buffer.getClass(), "strategy")
.get(buffer), BufferStrategy.IF_POSSIBLE);
...@@ -81,8 +80,7 @@ public class DataCarrierTest {
@Test
public void testIfPossibleProduce() throws IllegalAccessException {
DataCarrier<SampleData> carrier = new DataCarrier<SampleData>(2, 100, BufferStrategy.IF_POSSIBLE);
for (int i = 0; i < 200; i++) {
Assert.assertTrue(carrier.produce(new SampleData().setName("d" + i)));
......
Subproject commit 8c10f757a9088fef06d6d8b986b8a0650b7fa106
...@@ -124,6 +124,11 @@ public class Config {
* Keep tracing even if the backend is not available.
*/
public static boolean KEEP_TRACING = false;
/**
* Force open TLS for gRPC channel if true.
*/
public static boolean FORCE_TLS = false;
}
public static class OsInfo {
......
...@@ -21,7 +21,7 @@ package org.apache.skywalking.apm.agent.core.plugin.interceptor.enhance;
import org.apache.skywalking.apm.agent.core.plugin.interceptor.StaticMethodsInterceptPoint;
/**
* Plugins which only need to enhance class instance methods. Actually, inheriting from {@link
* ClassInstanceMethodsEnhancePluginDefine} is no different from inheriting from {@link ClassEnhancePluginDefine}. Just
* override {@link ClassEnhancePluginDefine#getStaticMethodsInterceptPoints} and return NULL, which means nothing to
* enhance.
......
...@@ -26,6 +26,7 @@ import java.io.File;
import javax.net.ssl.SSLException;
import org.apache.skywalking.apm.agent.core.boot.AgentPackageNotFoundException;
import org.apache.skywalking.apm.agent.core.boot.AgentPackagePath;
import org.apache.skywalking.apm.agent.core.conf.Config;
import org.apache.skywalking.apm.agent.core.conf.Constants;
/**
...@@ -38,9 +39,12 @@ public class TLSChannelBuilder implements ChannelBuilder<NettyChannelBuilder> {
public NettyChannelBuilder build(
NettyChannelBuilder managedChannelBuilder) throws AgentPackageNotFoundException, SSLException {
File caFile = new File(AgentPackagePath.getPath(), CA_FILE_NAME);
boolean isCAFileExist = caFile.exists() && caFile.isFile();
if (Config.Agent.FORCE_TLS || isCAFileExist) {
SslContextBuilder builder = GrpcSslContexts.forClient();
if (isCAFileExist) {
builder.trustManager(caFile);
}
managedChannelBuilder = managedChannelBuilder.negotiationType(NegotiationType.TLS)
.sslContext(builder.build());
}
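With this change, TLS can be switched on from configuration even when no CA file ships with the agent; when the CA file is absent, no explicit trust manager is set and the client falls back to the default trust store. Assuming the usual SkyWalking convention of mapping `Config.Agent` fields to `agent.*` keys overridable by `SW_*` environment variables, enabling it would look like this hypothetical `agent.config` fragment (key and variable names inferred from the `FORCE_TLS` field, not confirmed by this diff):

```
# Hypothetical rendering of Config.Agent.FORCE_TLS in agent.config:
# force a TLS gRPC channel to the OAP even without a bundled CA file.
agent.force_tls=${SW_AGENT_FORCE_TLS:true}
```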
......
...@@ -64,8 +64,7 @@ public class TraceSegmentServiceClient implements BootService, IConsumer<TraceSe
lastLogTime = System.currentTimeMillis();
segmentUplinkedCounter = 0;
segmentAbandonedCounter = 0;
carrier = new DataCarrier<>(CHANNEL_SIZE, BUFFER_SIZE, BufferStrategy.IF_POSSIBLE);
carrier.consume(this, 1);
}
......
...@@ -80,6 +80,22 @@ public class IndicesClientInstrumentation extends ClassEnhancePluginDefine {
return Constants.INDICES_CLIENT_DELETE_METHODS_INTERCEPTOR;
}
@Override
public boolean isOverrideArgs() {
return false;
}
},
new InstanceMethodsInterceptPoint() {
@Override
public ElementMatcher<MethodDescription> getMethodsMatcher() {
return named("analyze").or(named("analyzeAsync"));
}
@Override
public String getMethodsInterceptor() {
return Constants.INDICES_CLIENT_ANALYZE_METHODS_INTERCEPTOR;
}
@Override
public boolean isOverrideArgs() {
return false;
......
...@@ -18,11 +18,6 @@
package org.apache.skywalking.apm.plugin.elasticsearch.v6.define;
import net.bytebuddy.description.method.MethodDescription;
import net.bytebuddy.matcher.ElementMatcher;
import org.apache.skywalking.apm.agent.core.plugin.interceptor.ConstructorInterceptPoint;
...@@ -32,6 +27,11 @@ import org.apache.skywalking.apm.agent.core.plugin.interceptor.enhance.ClassEnha
import org.apache.skywalking.apm.agent.core.plugin.match.ClassMatch;
import org.apache.skywalking.apm.plugin.elasticsearch.v6.interceptor.Constants;
import static net.bytebuddy.matcher.ElementMatchers.named;
import static net.bytebuddy.matcher.ElementMatchers.takesArguments;
import static org.apache.skywalking.apm.agent.core.plugin.bytebuddy.ArgumentTypeNameMatch.takesArgumentWithType;
import static org.apache.skywalking.apm.agent.core.plugin.match.NameMatch.byName;
/**
* {@link RestHighLevelClientInstrumentation} enhances the constructor method without argument in
* <code>org.elasticsearch.client.RestHighLevelClient</code> by <code>org.apache.skywalking.apm.plugin.elasticsearch.v6.interceptor.RestHighLevelClientConInterceptor</code>
...@@ -181,6 +181,70 @@ public class RestHighLevelClientInstrumentation extends ClassEnhancePluginDefine
return Constants.REST_HIGH_LEVEL_CLIENT_CLUSTER_METHODS_INTERCEPTOR;
}
@Override
public boolean isOverrideArgs() {
return false;
}
},
new InstanceMethodsInterceptPoint() {
@Override
public ElementMatcher<MethodDescription> getMethodsMatcher() {
return named("scroll").or(named("scrollAsync"));
}
@Override
public String getMethodsInterceptor() {
return Constants.REST_HIGH_LEVEL_CLIENT_SEARCH_SCROLL_METHODS_INTERCEPTOR;
}
@Override
public boolean isOverrideArgs() {
return false;
}
},
new InstanceMethodsInterceptPoint() {
@Override
public ElementMatcher<MethodDescription> getMethodsMatcher() {
return named("searchTemplate").or(named("searchTemplateAsync"));
}
@Override
public String getMethodsInterceptor() {
return Constants.REST_HIGH_LEVEL_CLIENT_SEARCH_TEMPLATE_METHODS_INTERCEPTOR;
}
@Override
public boolean isOverrideArgs() {
return false;
}
},
new InstanceMethodsInterceptPoint() {
@Override
public ElementMatcher<MethodDescription> getMethodsMatcher() {
return named("clearScroll").or(named("clearScrollAsync"));
}
@Override
public String getMethodsInterceptor() {
return Constants.REST_HIGH_LEVEL_CLIENT_CLEAR_SCROLL_METHODS_INTERCEPTOR;
}
@Override
public boolean isOverrideArgs() {
return false;
}
},
new InstanceMethodsInterceptPoint() {
@Override
public ElementMatcher<MethodDescription> getMethodsMatcher() {
return named("deleteByQuery").or(named("deleteByQueryAsync"));
}
@Override
public String getMethodsInterceptor() {
return Constants.REST_HIGH_LEVEL_CLIENT_DELETE_BY_QUERY_METHODS_INTERCEPTOR;
}
@Override
public boolean isOverrideArgs() {
return false;
......
...@@ -26,6 +26,11 @@ public class Constants {
public static final String REST_HIGH_LEVEL_CLIENT_CON_INTERCEPTOR = "org.apache.skywalking.apm.plugin.elasticsearch.v6.interceptor.RestHighLevelClientConInterceptor";
public static final String INDICES_CLIENT_CREATE_METHODS_INTERCEPTOR = "org.apache.skywalking.apm.plugin.elasticsearch.v6.interceptor.IndicesClientCreateMethodsInterceptor";
public static final String INDICES_CLIENT_DELETE_METHODS_INTERCEPTOR = "org.apache.skywalking.apm.plugin.elasticsearch.v6.interceptor.IndicesClientDeleteMethodsInterceptor";
public static final String INDICES_CLIENT_ANALYZE_METHODS_INTERCEPTOR = "org.apache.skywalking.apm.plugin.elasticsearch.v6.interceptor.IndicesClientAnalyzeMethodsInterceptor";
public static final String REST_HIGH_LEVEL_CLIENT_SEARCH_SCROLL_METHODS_INTERCEPTOR = "org.apache.skywalking.apm.plugin.elasticsearch.v6.interceptor.RestHighLevelClientSearchScrollMethodsInterceptor";
public static final String REST_HIGH_LEVEL_CLIENT_SEARCH_TEMPLATE_METHODS_INTERCEPTOR = "org.apache.skywalking.apm.plugin.elasticsearch.v6.interceptor.RestHighLevelClientSearchTemplateMethodsInterceptor";
public static final String REST_HIGH_LEVEL_CLIENT_CLEAR_SCROLL_METHODS_INTERCEPTOR = "org.apache.skywalking.apm.plugin.elasticsearch.v6.interceptor.RestHighLevelClientClearScrollMethodsInterceptor";
public static final String REST_HIGH_LEVEL_CLIENT_DELETE_BY_QUERY_METHODS_INTERCEPTOR = "org.apache.skywalking.apm.plugin.elasticsearch.v6.interceptor.RestHighLevelClientDeleteByQueryMethodsInterceptor";
public static final String REST_HIGH_LEVEL_CLIENT_GET_METHODS_INTERCEPTOR = "org.apache.skywalking.apm.plugin.elasticsearch.v6.interceptor.RestHighLevelClientGetMethodsInterceptor";
public static final String REST_HIGH_LEVEL_CLIENT_SEARCH_METHODS_INTERCEPTOR = "org.apache.skywalking.apm.plugin.elasticsearch.v6.interceptor.RestHighLevelClientSearchMethodsInterceptor";
public static final String REST_HIGH_LEVEL_CLIENT_UPDATE_METHODS_INTERCEPTOR = "org.apache.skywalking.apm.plugin.elasticsearch.v6.interceptor.RestHighLevelClientUpdateMethodsInterceptor";
...@@ -42,10 +47,15 @@ public class Constants {
//es operator name
public static final String CREATE_OPERATOR_NAME = "Elasticsearch/CreateRequest";
public static final String DELETE_OPERATOR_NAME = "Elasticsearch/DeleteRequest";
public static final String ANALYZE_OPERATOR_NAME = "Elasticsearch/AnalyzeRequest";
public static final String GET_OPERATOR_NAME = "Elasticsearch/GetRequest";
public static final String INDEX_OPERATOR_NAME = "Elasticsearch/IndexRequest";
public static final String SEARCH_OPERATOR_NAME = "Elasticsearch/SearchRequest";
public static final String UPDATE_OPERATOR_NAME = "Elasticsearch/UpdateRequest";
public static final String SEARCH_SCROLL_OPERATOR_NAME = "Elasticsearch/SearchScrollRequest";
public static final String SEARCH_TEMPLATE_OPERATOR_NAME = "Elasticsearch/SearchTemplateRequest";
public static final String CLEAR_SCROLL_OPERATOR_NAME = "Elasticsearch/ClearScrollRequest";
public static final String DELETE_BY_QUERY_OPERATOR_NAME = "Elasticsearch/DeleteByQueryRequest";
public static final String CLUSTER_HEALTH_NAME = "Elasticsearch/Health"; public static final String CLUSTER_HEALTH_NAME = "Elasticsearch/Health";
public static final String CLUSTER_GET_SETTINGS_NAME = "Elasticsearch/GetSettings"; public static final String CLUSTER_GET_SETTINGS_NAME = "Elasticsearch/GetSettings";
public static final String CLUSTER_PUT_SETTINGS_NAME = "Elasticsearch/PutSettings"; public static final String CLUSTER_PUT_SETTINGS_NAME = "Elasticsearch/PutSettings";
......
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*
*/
package org.apache.skywalking.apm.plugin.elasticsearch.v6.interceptor;
import org.apache.skywalking.apm.agent.core.context.ContextManager;
import org.apache.skywalking.apm.agent.core.context.tag.Tags;
import org.apache.skywalking.apm.agent.core.context.trace.AbstractSpan;
import org.apache.skywalking.apm.agent.core.context.trace.SpanLayer;
import org.apache.skywalking.apm.agent.core.plugin.interceptor.enhance.EnhancedInstance;
import org.apache.skywalking.apm.agent.core.plugin.interceptor.enhance.InstanceMethodsAroundInterceptor;
import org.apache.skywalking.apm.agent.core.plugin.interceptor.enhance.MethodInterceptResult;
import org.apache.skywalking.apm.network.trace.component.ComponentsDefine;
import org.apache.skywalking.apm.plugin.elasticsearch.v6.RestClientEnhanceInfo;
import org.elasticsearch.action.admin.indices.analyze.AnalyzeRequest;
import java.lang.reflect.Method;
import static org.apache.skywalking.apm.plugin.elasticsearch.v6.ElasticsearchPluginConfig.Plugin.Elasticsearch.TRACE_DSL;
import static org.apache.skywalking.apm.plugin.elasticsearch.v6.interceptor.Constants.DB_TYPE;
public class IndicesClientAnalyzeMethodsInterceptor implements InstanceMethodsAroundInterceptor {
@Override
public void beforeMethod(EnhancedInstance objInst, Method method, Object[] allArguments, Class<?>[] argumentsTypes,
MethodInterceptResult result) throws Throwable {
AnalyzeRequest analyzeRequest = (AnalyzeRequest) allArguments[0];
RestClientEnhanceInfo restClientEnhanceInfo = (RestClientEnhanceInfo) objInst.getSkyWalkingDynamicField();
if (restClientEnhanceInfo != null) {
AbstractSpan span = ContextManager.createExitSpan(Constants.ANALYZE_OPERATOR_NAME, restClientEnhanceInfo.getPeers());
span.setComponent(ComponentsDefine.REST_HIGH_LEVEL_CLIENT);
Tags.DB_TYPE.set(span, DB_TYPE);
span.tag(Tags.ofKey("analyzer"), analyzeRequest.analyzer());
if (TRACE_DSL) {
Tags.DB_STATEMENT.set(span, analyzeRequest.text()[0]);
}
SpanLayer.asDB(span);
}
}
@Override
public Object afterMethod(EnhancedInstance objInst, Method method, Object[] allArguments, Class<?>[] argumentsTypes,
Object ret) throws Throwable {
RestClientEnhanceInfo restClientEnhanceInfo = (RestClientEnhanceInfo) objInst.getSkyWalkingDynamicField();
if (restClientEnhanceInfo != null) {
ContextManager.stopSpan();
}
return ret;
}
@Override
public void handleMethodException(EnhancedInstance objInst, Method method, Object[] allArguments,
Class<?>[] argumentsTypes, Throwable t) {
RestClientEnhanceInfo restClientEnhanceInfo = (RestClientEnhanceInfo) objInst.getSkyWalkingDynamicField();
if (restClientEnhanceInfo != null) {
ContextManager.activeSpan().log(t);
}
}
}
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*
*/
package org.apache.skywalking.apm.plugin.elasticsearch.v6.interceptor;
import org.apache.skywalking.apm.agent.core.context.ContextManager;
import org.apache.skywalking.apm.agent.core.context.tag.Tags;
import org.apache.skywalking.apm.agent.core.context.trace.AbstractSpan;
import org.apache.skywalking.apm.agent.core.context.trace.SpanLayer;
import org.apache.skywalking.apm.agent.core.plugin.interceptor.enhance.EnhancedInstance;
import org.apache.skywalking.apm.agent.core.plugin.interceptor.enhance.InstanceMethodsAroundInterceptor;
import org.apache.skywalking.apm.agent.core.plugin.interceptor.enhance.MethodInterceptResult;
import org.apache.skywalking.apm.network.trace.component.ComponentsDefine;
import org.apache.skywalking.apm.plugin.elasticsearch.v6.RestClientEnhanceInfo;
import org.elasticsearch.action.search.ClearScrollRequest;
import java.lang.reflect.Method;
import java.util.stream.Collectors;
import static org.apache.skywalking.apm.plugin.elasticsearch.v6.ElasticsearchPluginConfig.Plugin.Elasticsearch.TRACE_DSL;
import static org.apache.skywalking.apm.plugin.elasticsearch.v6.interceptor.Constants.DB_TYPE;
public class RestHighLevelClientClearScrollMethodsInterceptor implements InstanceMethodsAroundInterceptor {
@Override
public void beforeMethod(EnhancedInstance objInst, Method method, Object[] allArguments, Class<?>[] argumentsTypes,
MethodInterceptResult result) throws Throwable {
ClearScrollRequest clearScrollRequest = (ClearScrollRequest) allArguments[0];
RestClientEnhanceInfo restClientEnhanceInfo = (RestClientEnhanceInfo) objInst.getSkyWalkingDynamicField();
AbstractSpan span = ContextManager.createExitSpan(Constants.CLEAR_SCROLL_OPERATOR_NAME, restClientEnhanceInfo.getPeers());
span.setComponent(ComponentsDefine.REST_HIGH_LEVEL_CLIENT);
Tags.DB_TYPE.set(span, DB_TYPE);
if (TRACE_DSL) {
Tags.DB_STATEMENT.set(span, clearScrollRequest.scrollIds().stream().collect(Collectors.joining(",")));
}
SpanLayer.asDB(span);
}
@Override
public Object afterMethod(EnhancedInstance objInst, Method method, Object[] allArguments, Class<?>[] argumentsTypes,
Object ret) throws Throwable {
ContextManager.stopSpan();
return ret;
}
@Override
public void handleMethodException(EnhancedInstance objInst, Method method, Object[] allArguments,
Class<?>[] argumentsTypes, Throwable t) {
ContextManager.activeSpan().log(t);
}
}
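For reference, when `TRACE_DSL` is enabled the clear-scroll interceptor records the scroll ids joined with commas as the `DB_STATEMENT` tag. A minimal standalone sketch of that formatting (the `ScrollIdJoinDemo` class name is invented for illustration and is not part of the plugin):

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

// Hypothetical demo: reproduces the string the interceptor writes into
// Tags.DB_STATEMENT -- all scroll ids of the request, comma-separated.
public class ScrollIdJoinDemo {
    static String joinScrollIds(List<String> scrollIds) {
        // Same expression as in the interceptor's TRACE_DSL branch
        return scrollIds.stream().collect(Collectors.joining(","));
    }

    public static void main(String[] args) {
        System.out.println(joinScrollIds(Arrays.asList("scroll-1", "scroll-2")));
    }
}
```

This is also the value the corresponding unit test asserts on (a single id comes through unchanged, with no trailing separator).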
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*
*/
package org.apache.skywalking.apm.plugin.elasticsearch.v6.interceptor;
import org.apache.skywalking.apm.agent.core.context.ContextManager;
import org.apache.skywalking.apm.agent.core.context.tag.Tags;
import org.apache.skywalking.apm.agent.core.context.trace.AbstractSpan;
import org.apache.skywalking.apm.agent.core.context.trace.SpanLayer;
import org.apache.skywalking.apm.agent.core.plugin.interceptor.enhance.EnhancedInstance;
import org.apache.skywalking.apm.agent.core.plugin.interceptor.enhance.InstanceMethodsAroundInterceptor;
import org.apache.skywalking.apm.agent.core.plugin.interceptor.enhance.MethodInterceptResult;
import org.apache.skywalking.apm.network.trace.component.ComponentsDefine;
import org.apache.skywalking.apm.plugin.elasticsearch.v6.RestClientEnhanceInfo;
import org.elasticsearch.index.reindex.DeleteByQueryRequest;
import java.lang.reflect.Method;
import java.util.Arrays;
import static org.apache.skywalking.apm.plugin.elasticsearch.v6.ElasticsearchPluginConfig.Plugin.Elasticsearch.TRACE_DSL;
import static org.apache.skywalking.apm.plugin.elasticsearch.v6.interceptor.Constants.DB_TYPE;
public class RestHighLevelClientDeleteByQueryMethodsInterceptor implements InstanceMethodsAroundInterceptor {
@Override
public void beforeMethod(EnhancedInstance objInst, Method method, Object[] allArguments, Class<?>[] argumentsTypes,
MethodInterceptResult result) throws Throwable {
DeleteByQueryRequest deleteByQueryRequest = (DeleteByQueryRequest) allArguments[0];
RestClientEnhanceInfo restClientEnhanceInfo = (RestClientEnhanceInfo) objInst.getSkyWalkingDynamicField();
AbstractSpan span = ContextManager.createExitSpan(Constants.DELETE_BY_QUERY_OPERATOR_NAME, restClientEnhanceInfo.getPeers());
span.setComponent(ComponentsDefine.REST_HIGH_LEVEL_CLIENT);
Tags.DB_TYPE.set(span, DB_TYPE);
Tags.DB_INSTANCE.set(span, Arrays.asList(deleteByQueryRequest.indices()).toString());
if (TRACE_DSL) {
if (deleteByQueryRequest.getSearchRequest() != null) {
Tags.DB_STATEMENT.set(span, deleteByQueryRequest.getSearchRequest().toString());
} else {
Tags.DB_STATEMENT.set(span, deleteByQueryRequest.toString());
}
}
SpanLayer.asDB(span);
}
@Override
public Object afterMethod(EnhancedInstance objInst, Method method, Object[] allArguments, Class<?>[] argumentsTypes,
Object ret) throws Throwable {
ContextManager.stopSpan();
return ret;
}
@Override
public void handleMethodException(EnhancedInstance objInst, Method method, Object[] allArguments,
Class<?>[] argumentsTypes, Throwable t) {
ContextManager.activeSpan().log(t);
}
}
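A side note on the `DB_INSTANCE` tag set above: it is the `toString()` of `Arrays.asList(deleteByQueryRequest.indices())`, which explains the bracketed `"[indexName]"` value the unit test expects. A minimal standalone sketch (the `IndicesTagDemo` class name is invented for illustration):

```java
import java.util.Arrays;

// Hypothetical demo: shows the "[a, b]" style rendering that
// Arrays.asList(...).toString() produces for the indices array.
public class IndicesTagDemo {
    static String indicesTag(String[] indices) {
        // Same expression as the interceptor uses for Tags.DB_INSTANCE
        return Arrays.asList(indices).toString();
    }

    public static void main(String[] args) {
        System.out.println(indicesTag(new String[] {"indexName"}));
    }
}
```

Note that multiple indices are rendered with a comma and a space (e.g. `[a, b]`), following `AbstractCollection.toString()`.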
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*
*/
package org.apache.skywalking.apm.plugin.elasticsearch.v6.interceptor;
import org.apache.skywalking.apm.agent.core.context.ContextManager;
import org.apache.skywalking.apm.agent.core.context.tag.Tags;
import org.apache.skywalking.apm.agent.core.context.trace.AbstractSpan;
import org.apache.skywalking.apm.agent.core.context.trace.SpanLayer;
import org.apache.skywalking.apm.agent.core.plugin.interceptor.enhance.EnhancedInstance;
import org.apache.skywalking.apm.agent.core.plugin.interceptor.enhance.InstanceMethodsAroundInterceptor;
import org.apache.skywalking.apm.agent.core.plugin.interceptor.enhance.MethodInterceptResult;
import org.apache.skywalking.apm.network.trace.component.ComponentsDefine;
import org.apache.skywalking.apm.plugin.elasticsearch.v6.RestClientEnhanceInfo;
import org.elasticsearch.action.search.SearchScrollRequest;
import java.lang.reflect.Method;
import static org.apache.skywalking.apm.plugin.elasticsearch.v6.ElasticsearchPluginConfig.Plugin.Elasticsearch.TRACE_DSL;
import static org.apache.skywalking.apm.plugin.elasticsearch.v6.interceptor.Constants.DB_TYPE;
public class RestHighLevelClientSearchScrollMethodsInterceptor implements InstanceMethodsAroundInterceptor {
@Override
public void beforeMethod(EnhancedInstance objInst, Method method, Object[] allArguments, Class<?>[] argumentsTypes,
MethodInterceptResult result) throws Throwable {
SearchScrollRequest searchScrollRequest = (SearchScrollRequest) allArguments[0];
RestClientEnhanceInfo restClientEnhanceInfo = (RestClientEnhanceInfo) objInst.getSkyWalkingDynamicField();
AbstractSpan span = ContextManager.createExitSpan(Constants.SEARCH_SCROLL_OPERATOR_NAME, restClientEnhanceInfo.getPeers());
span.setComponent(ComponentsDefine.REST_HIGH_LEVEL_CLIENT);
Tags.DB_TYPE.set(span, DB_TYPE);
if (TRACE_DSL) {
Tags.DB_STATEMENT.set(span, searchScrollRequest.toString());
}
SpanLayer.asDB(span);
}
@Override
public Object afterMethod(EnhancedInstance objInst, Method method, Object[] allArguments, Class<?>[] argumentsTypes,
Object ret) throws Throwable {
ContextManager.stopSpan();
return ret;
}
@Override
public void handleMethodException(EnhancedInstance objInst, Method method, Object[] allArguments,
Class<?>[] argumentsTypes, Throwable t) {
ContextManager.activeSpan().log(t);
}
}
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*
*/
package org.apache.skywalking.apm.plugin.elasticsearch.v6.interceptor;
import org.apache.skywalking.apm.agent.core.context.ContextManager;
import org.apache.skywalking.apm.agent.core.context.tag.Tags;
import org.apache.skywalking.apm.agent.core.context.trace.AbstractSpan;
import org.apache.skywalking.apm.agent.core.context.trace.SpanLayer;
import org.apache.skywalking.apm.agent.core.plugin.interceptor.enhance.EnhancedInstance;
import org.apache.skywalking.apm.agent.core.plugin.interceptor.enhance.InstanceMethodsAroundInterceptor;
import org.apache.skywalking.apm.agent.core.plugin.interceptor.enhance.MethodInterceptResult;
import org.apache.skywalking.apm.network.trace.component.ComponentsDefine;
import org.apache.skywalking.apm.plugin.elasticsearch.v6.RestClientEnhanceInfo;
import org.elasticsearch.script.mustache.SearchTemplateRequest;
import java.lang.reflect.Method;
import java.util.Arrays;
import static org.apache.skywalking.apm.plugin.elasticsearch.v6.ElasticsearchPluginConfig.Plugin.Elasticsearch.TRACE_DSL;
import static org.apache.skywalking.apm.plugin.elasticsearch.v6.interceptor.Constants.DB_TYPE;
public class RestHighLevelClientSearchTemplateMethodsInterceptor implements InstanceMethodsAroundInterceptor {
@Override
public void beforeMethod(EnhancedInstance objInst, Method method, Object[] allArguments, Class<?>[] argumentsTypes,
MethodInterceptResult result) throws Throwable {
SearchTemplateRequest searchTemplateRequest = (SearchTemplateRequest) allArguments[0];
RestClientEnhanceInfo restClientEnhanceInfo = (RestClientEnhanceInfo) objInst.getSkyWalkingDynamicField();
AbstractSpan span = ContextManager.createExitSpan(Constants.SEARCH_TEMPLATE_OPERATOR_NAME, restClientEnhanceInfo.getPeers());
span.setComponent(ComponentsDefine.REST_HIGH_LEVEL_CLIENT);
Tags.DB_TYPE.set(span, DB_TYPE);
if (searchTemplateRequest.getRequest() != null) {
Tags.DB_INSTANCE.set(span, Arrays.asList(searchTemplateRequest.getRequest().indices()).toString());
}
if (TRACE_DSL) {
Tags.DB_STATEMENT.set(span, searchTemplateRequest.getScript());
}
SpanLayer.asDB(span);
}
@Override
public Object afterMethod(EnhancedInstance objInst, Method method, Object[] allArguments, Class<?>[] argumentsTypes,
Object ret) throws Throwable {
ContextManager.stopSpan();
return ret;
}
@Override
public void handleMethodException(EnhancedInstance objInst, Method method, Object[] allArguments,
Class<?>[] argumentsTypes, Throwable t) {
ContextManager.activeSpan().log(t);
}
}
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.skywalking.apm.plugin.elasticsearch.v6.interceptor;
import org.apache.skywalking.apm.agent.core.context.trace.AbstractTracingSpan;
import org.apache.skywalking.apm.agent.core.context.trace.ExitSpan;
import org.apache.skywalking.apm.agent.core.context.trace.TraceSegment;
import org.apache.skywalking.apm.agent.core.context.util.TagValuePair;
import org.apache.skywalking.apm.agent.core.plugin.interceptor.enhance.EnhancedInstance;
import org.apache.skywalking.apm.agent.test.helper.SegmentHelper;
import org.apache.skywalking.apm.agent.test.helper.SpanHelper;
import org.apache.skywalking.apm.agent.test.tools.AgentServiceRule;
import org.apache.skywalking.apm.agent.test.tools.SegmentStorage;
import org.apache.skywalking.apm.agent.test.tools.SegmentStoragePoint;
import org.apache.skywalking.apm.agent.test.tools.SpanAssert;
import org.apache.skywalking.apm.agent.test.tools.TracingSegmentRunner;
import org.apache.skywalking.apm.plugin.elasticsearch.v6.RestClientEnhanceInfo;
import org.elasticsearch.action.admin.indices.analyze.AnalyzeRequest;
import org.junit.Assert;
import org.junit.Before;
import org.junit.Rule;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.powermock.modules.junit4.PowerMockRunner;
import org.powermock.modules.junit4.PowerMockRunnerDelegate;
import java.util.List;
import static org.apache.skywalking.apm.plugin.elasticsearch.v6.ElasticsearchPluginConfig.Plugin.Elasticsearch.TRACE_DSL;
import static org.hamcrest.CoreMatchers.is;
import static org.junit.Assert.assertThat;
import static org.powermock.api.mockito.PowerMockito.when;
@RunWith(PowerMockRunner.class)
@PowerMockRunnerDelegate(TracingSegmentRunner.class)
public class IndicesClientAnalyzeMethodsInterceptorTest {
@SegmentStoragePoint
private SegmentStorage segmentStorage;
@Rule
public AgentServiceRule serviceRule = new AgentServiceRule();
@Mock
private EnhancedInstance enhancedInstance;
@Mock
private AnalyzeRequest analyzeRequest;
private Object[] allArguments;
@Mock
private RestClientEnhanceInfo restClientEnhanceInfo;
private IndicesClientAnalyzeMethodsInterceptor interceptor;
@Before
public void setUp() throws Exception {
when(restClientEnhanceInfo.getPeers()).thenReturn("127.0.0.1:9200");
allArguments = new Object[] {analyzeRequest};
when(analyzeRequest.analyzer()).thenReturn("analyzer");
when(analyzeRequest.text()).thenReturn(new String[] {"exampleText"});
when(enhancedInstance.getSkyWalkingDynamicField()).thenReturn(restClientEnhanceInfo);
interceptor = new IndicesClientAnalyzeMethodsInterceptor();
}
@Test
public void testMethodsAround() throws Throwable {
TRACE_DSL = true;
interceptor.beforeMethod(enhancedInstance, null, allArguments, null, null);
interceptor.afterMethod(enhancedInstance, null, allArguments, null, null);
List<TraceSegment> traceSegmentList = segmentStorage.getTraceSegments();
Assert.assertThat(traceSegmentList.size(), is(1));
TraceSegment traceSegment = traceSegmentList.get(0);
AbstractTracingSpan analyzeSpan = SegmentHelper.getSpans(traceSegment).get(0);
assertAnalyzeSpan(analyzeSpan);
}
private void assertAnalyzeSpan(AbstractTracingSpan analyzeSpan) {
assertThat(analyzeSpan instanceof ExitSpan, is(true));
ExitSpan exitSpan = (ExitSpan) analyzeSpan;
assertThat(exitSpan.getOperationName(), is("Elasticsearch/AnalyzeRequest"));
assertThat(exitSpan.getPeer(), is("127.0.0.1:9200"));
assertThat(SpanHelper.getComponentId(exitSpan), is(77));
List<TagValuePair> tags = SpanHelper.getTags(exitSpan);
assertThat(tags.size(), is(3));
assertThat(tags.get(0).getValue(), is("Elasticsearch"));
assertThat(tags.get(1).getValue(), is("analyzer"));
assertThat(tags.get(2).getValue(), is("exampleText"));
}
@Test
public void testMethodsAroundError() throws Throwable {
TRACE_DSL = true;
interceptor.beforeMethod(enhancedInstance, null, allArguments, null, null);
interceptor.handleMethodException(enhancedInstance, null, allArguments, null, new RuntimeException());
interceptor.afterMethod(enhancedInstance, null, allArguments, null, null);
List<TraceSegment> traceSegmentList = segmentStorage.getTraceSegments();
Assert.assertThat(traceSegmentList.size(), is(1));
TraceSegment traceSegment = traceSegmentList.get(0);
AbstractTracingSpan analyzeSpan = SegmentHelper.getSpans(traceSegment).get(0);
assertAnalyzeSpan(analyzeSpan);
Assert.assertTrue(SpanHelper.getErrorOccurred(analyzeSpan));
SpanAssert.assertException(SpanHelper.getLogs(analyzeSpan).get(0), RuntimeException.class);
}
}
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.skywalking.apm.plugin.elasticsearch.v6.interceptor;
import org.apache.skywalking.apm.agent.core.context.trace.AbstractTracingSpan;
import org.apache.skywalking.apm.agent.core.context.trace.ExitSpan;
import org.apache.skywalking.apm.agent.core.context.trace.TraceSegment;
import org.apache.skywalking.apm.agent.core.context.util.TagValuePair;
import org.apache.skywalking.apm.agent.core.plugin.interceptor.enhance.EnhancedInstance;
import org.apache.skywalking.apm.agent.test.helper.SegmentHelper;
import org.apache.skywalking.apm.agent.test.helper.SpanHelper;
import org.apache.skywalking.apm.agent.test.tools.AgentServiceRule;
import org.apache.skywalking.apm.agent.test.tools.SegmentStorage;
import org.apache.skywalking.apm.agent.test.tools.SegmentStoragePoint;
import org.apache.skywalking.apm.agent.test.tools.SpanAssert;
import org.apache.skywalking.apm.agent.test.tools.TracingSegmentRunner;
import org.apache.skywalking.apm.plugin.elasticsearch.v6.RestClientEnhanceInfo;
import org.elasticsearch.action.search.ClearScrollRequest;
import org.junit.Assert;
import org.junit.Before;
import org.junit.Rule;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.powermock.modules.junit4.PowerMockRunner;
import org.powermock.modules.junit4.PowerMockRunnerDelegate;
import java.util.ArrayList;
import java.util.List;
import static org.apache.skywalking.apm.plugin.elasticsearch.v6.ElasticsearchPluginConfig.Plugin.Elasticsearch.TRACE_DSL;
import static org.hamcrest.CoreMatchers.is;
import static org.junit.Assert.assertThat;
import static org.powermock.api.mockito.PowerMockito.when;
@RunWith(PowerMockRunner.class)
@PowerMockRunnerDelegate(TracingSegmentRunner.class)
public class RestHighLevelClientClearScrollMethodsInterceptorTest {
@SegmentStoragePoint
private SegmentStorage segmentStorage;
@Rule
public AgentServiceRule serviceRule = new AgentServiceRule();
@Mock
private EnhancedInstance enhancedInstance;
@Mock
private ClearScrollRequest clearScrollRequest;
private Object[] allArguments;
@Mock
private RestClientEnhanceInfo restClientEnhanceInfo;
private RestHighLevelClientClearScrollMethodsInterceptor interceptor;
@Before
public void setUp() throws Exception {
when(restClientEnhanceInfo.getPeers()).thenReturn("127.0.0.1:9200");
allArguments = new Object[] {clearScrollRequest};
List<String> testList = new ArrayList<>(1);
testList.add("testScrollId");
when(clearScrollRequest.scrollIds()).thenReturn(testList);
when(enhancedInstance.getSkyWalkingDynamicField()).thenReturn(restClientEnhanceInfo);
interceptor = new RestHighLevelClientClearScrollMethodsInterceptor();
}
@Test
public void testMethodsAround() throws Throwable {
TRACE_DSL = true;
interceptor.beforeMethod(enhancedInstance, null, allArguments, null, null);
interceptor.afterMethod(enhancedInstance, null, allArguments, null, null);
List<TraceSegment> traceSegmentList = segmentStorage.getTraceSegments();
Assert.assertThat(traceSegmentList.size(), is(1));
TraceSegment traceSegment = traceSegmentList.get(0);
AbstractTracingSpan clearScrollSpan = SegmentHelper.getSpans(traceSegment).get(0);
assertClearScrollSpan(clearScrollSpan);
}
private void assertClearScrollSpan(AbstractTracingSpan clearScrollSpan) {
assertThat(clearScrollSpan instanceof ExitSpan, is(true));
ExitSpan exitSpan = (ExitSpan) clearScrollSpan;
assertThat(exitSpan.getOperationName(), is("Elasticsearch/ClearScrollRequest"));
assertThat(exitSpan.getPeer(), is("127.0.0.1:9200"));
assertThat(SpanHelper.getComponentId(exitSpan), is(77));
List<TagValuePair> tags = SpanHelper.getTags(exitSpan);
assertThat(tags.size(), is(2));
assertThat(tags.get(0).getValue(), is("Elasticsearch"));
assertThat(tags.get(1).getValue(), is("testScrollId"));
}
@Test
public void testMethodsAroundError() throws Throwable {
TRACE_DSL = true;
interceptor.beforeMethod(enhancedInstance, null, allArguments, null, null);
interceptor.handleMethodException(enhancedInstance, null, allArguments, null, new RuntimeException());
interceptor.afterMethod(enhancedInstance, null, allArguments, null, null);
List<TraceSegment> traceSegmentList = segmentStorage.getTraceSegments();
Assert.assertThat(traceSegmentList.size(), is(1));
TraceSegment traceSegment = traceSegmentList.get(0);
AbstractTracingSpan clearScrollSpan = SegmentHelper.getSpans(traceSegment).get(0);
assertClearScrollSpan(clearScrollSpan);
Assert.assertTrue(SpanHelper.getErrorOccurred(clearScrollSpan));
SpanAssert.assertException(SpanHelper.getLogs(clearScrollSpan).get(0), RuntimeException.class);
}
}
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.skywalking.apm.plugin.elasticsearch.v6.interceptor;
import org.apache.skywalking.apm.agent.core.context.trace.AbstractTracingSpan;
import org.apache.skywalking.apm.agent.core.context.trace.ExitSpan;
import org.apache.skywalking.apm.agent.core.context.trace.TraceSegment;
import org.apache.skywalking.apm.agent.core.context.util.TagValuePair;
import org.apache.skywalking.apm.agent.core.plugin.interceptor.enhance.EnhancedInstance;
import org.apache.skywalking.apm.agent.test.helper.SegmentHelper;
import org.apache.skywalking.apm.agent.test.helper.SpanHelper;
import org.apache.skywalking.apm.agent.test.tools.AgentServiceRule;
import org.apache.skywalking.apm.agent.test.tools.SegmentStorage;
import org.apache.skywalking.apm.agent.test.tools.SegmentStoragePoint;
import org.apache.skywalking.apm.agent.test.tools.SpanAssert;
import org.apache.skywalking.apm.agent.test.tools.TracingSegmentRunner;
import org.apache.skywalking.apm.plugin.elasticsearch.v6.RestClientEnhanceInfo;
import org.elasticsearch.index.reindex.DeleteByQueryRequest;
import org.junit.Assert;
import org.junit.Before;
import org.junit.Rule;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.powermock.modules.junit4.PowerMockRunner;
import org.powermock.modules.junit4.PowerMockRunnerDelegate;
import java.util.List;
import static org.apache.skywalking.apm.plugin.elasticsearch.v6.ElasticsearchPluginConfig.Plugin.Elasticsearch.TRACE_DSL;
import static org.hamcrest.CoreMatchers.is;
import static org.junit.Assert.assertThat;
import static org.powermock.api.mockito.PowerMockito.when;
@RunWith(PowerMockRunner.class)
@PowerMockRunnerDelegate(TracingSegmentRunner.class)
public class RestHighLevelClientDeleteByQueryMethodsInterceptorTest {
@SegmentStoragePoint
private SegmentStorage segmentStorage;
@Rule
public AgentServiceRule serviceRule = new AgentServiceRule();
@Mock
private EnhancedInstance enhancedInstance;
@Mock
private DeleteByQueryRequest deleteByQueryRequest;
private Object[] allArguments;
@Mock
private RestClientEnhanceInfo restClientEnhanceInfo;
private RestHighLevelClientDeleteByQueryMethodsInterceptor interceptor;
@Before
public void setUp() throws Exception {
when(restClientEnhanceInfo.getPeers()).thenReturn("127.0.0.1:9200");
allArguments = new Object[] {deleteByQueryRequest};
when(deleteByQueryRequest.indices()).thenReturn(new String[] {"indexName"});
when(deleteByQueryRequest.toString()).thenReturn("deleteByQueryRequest");
when(enhancedInstance.getSkyWalkingDynamicField()).thenReturn(restClientEnhanceInfo);
interceptor = new RestHighLevelClientDeleteByQueryMethodsInterceptor();
}
@Test
public void testMethodsAround() throws Throwable {
TRACE_DSL = true;
interceptor.beforeMethod(enhancedInstance, null, allArguments, null, null);
interceptor.afterMethod(enhancedInstance, null, allArguments, null, null);
List<TraceSegment> traceSegmentList = segmentStorage.getTraceSegments();
Assert.assertThat(traceSegmentList.size(), is(1));
TraceSegment traceSegment = traceSegmentList.get(0);
AbstractTracingSpan deleteByQuerySpan = SegmentHelper.getSpans(traceSegment).get(0);
assertDeleteByQuerySpan(deleteByQuerySpan);
}
private void assertDeleteByQuerySpan(AbstractTracingSpan deleteByQuerySpan) {
assertThat(deleteByQuerySpan instanceof ExitSpan, is(true));
ExitSpan exitSpan = (ExitSpan) deleteByQuerySpan;
assertThat(exitSpan.getOperationName(), is("Elasticsearch/DeleteByQueryRequest"));
assertThat(exitSpan.getPeer(), is("127.0.0.1:9200"));
assertThat(SpanHelper.getComponentId(exitSpan), is(77));
List<TagValuePair> tags = SpanHelper.getTags(exitSpan);
assertThat(tags.size(), is(3));
assertThat(tags.get(0).getValue(), is("Elasticsearch"));
assertThat(tags.get(1).getValue(), is("[indexName]"));
assertThat(tags.get(2).getValue(), is("deleteByQueryRequest"));
}
@Test
public void testMethodsAroundError() throws Throwable {
TRACE_DSL = true;
interceptor.beforeMethod(enhancedInstance, null, allArguments, null, null);
interceptor.handleMethodException(enhancedInstance, null, allArguments, null, new RuntimeException());
interceptor.afterMethod(enhancedInstance, null, allArguments, null, null);
List<TraceSegment> traceSegmentList = segmentStorage.getTraceSegments();
Assert.assertThat(traceSegmentList.size(), is(1));
TraceSegment traceSegment = traceSegmentList.get(0);
AbstractTracingSpan deleteByQuerySpan = SegmentHelper.getSpans(traceSegment).get(0);
assertDeleteByQuerySpan(deleteByQuerySpan);
Assert.assertEquals(true, SpanHelper.getErrorOccurred(deleteByQuerySpan));
SpanAssert.assertException(SpanHelper.getLogs(deleteByQuerySpan).get(0), RuntimeException.class);
}
}
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.skywalking.apm.plugin.elasticsearch.v6.interceptor;
import org.apache.skywalking.apm.agent.core.context.trace.AbstractTracingSpan;
import org.apache.skywalking.apm.agent.core.context.trace.ExitSpan;
import org.apache.skywalking.apm.agent.core.context.trace.TraceSegment;
import org.apache.skywalking.apm.agent.core.context.util.TagValuePair;
import org.apache.skywalking.apm.agent.core.plugin.interceptor.enhance.EnhancedInstance;
import org.apache.skywalking.apm.agent.test.helper.SegmentHelper;
import org.apache.skywalking.apm.agent.test.helper.SpanHelper;
import org.apache.skywalking.apm.agent.test.tools.AgentServiceRule;
import org.apache.skywalking.apm.agent.test.tools.SegmentStorage;
import org.apache.skywalking.apm.agent.test.tools.SegmentStoragePoint;
import org.apache.skywalking.apm.agent.test.tools.SpanAssert;
import org.apache.skywalking.apm.agent.test.tools.TracingSegmentRunner;
import org.apache.skywalking.apm.plugin.elasticsearch.v6.RestClientEnhanceInfo;
import org.elasticsearch.action.search.SearchScrollRequest;
import org.junit.Assert;
import org.junit.Before;
import org.junit.Rule;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.powermock.modules.junit4.PowerMockRunner;
import org.powermock.modules.junit4.PowerMockRunnerDelegate;
import java.util.List;
import static org.apache.skywalking.apm.plugin.elasticsearch.v6.ElasticsearchPluginConfig.Plugin.Elasticsearch.TRACE_DSL;
import static org.hamcrest.CoreMatchers.is;
import static org.junit.Assert.assertThat;
import static org.powermock.api.mockito.PowerMockito.when;
@RunWith(PowerMockRunner.class)
@PowerMockRunnerDelegate(TracingSegmentRunner.class)
public class RestHighLevelClientSearchScrollMethodsInterceptorTest {
@SegmentStoragePoint
private SegmentStorage segmentStorage;
@Rule
public AgentServiceRule serviceRule = new AgentServiceRule();
@Mock
private EnhancedInstance enhancedInstance;
@Mock
private SearchScrollRequest searchScrollRequest;
private Object[] allArguments;
@Mock
private RestClientEnhanceInfo restClientEnhanceInfo;
private RestHighLevelClientSearchScrollMethodsInterceptor interceptor;
@Before
public void setUp() throws Exception {
when(restClientEnhanceInfo.getPeers()).thenReturn("127.0.0.1:9200");
allArguments = new Object[] {searchScrollRequest};
when(searchScrollRequest.toString()).thenReturn("searchScrollRequest");
when(enhancedInstance.getSkyWalkingDynamicField()).thenReturn(restClientEnhanceInfo);
interceptor = new RestHighLevelClientSearchScrollMethodsInterceptor();
}
@Test
public void testMethodsAround() throws Throwable {
TRACE_DSL = true;
interceptor.beforeMethod(enhancedInstance, null, allArguments, null, null);
interceptor.afterMethod(enhancedInstance, null, allArguments, null, null);
List<TraceSegment> traceSegmentList = segmentStorage.getTraceSegments();
Assert.assertThat(traceSegmentList.size(), is(1));
TraceSegment traceSegment = traceSegmentList.get(0);
AbstractTracingSpan searchScrollSpan = SegmentHelper.getSpans(traceSegment).get(0);
assertSearchScrollSpan(searchScrollSpan);
}
private void assertSearchScrollSpan(AbstractTracingSpan searchScrollSpan) {
assertThat(searchScrollSpan instanceof ExitSpan, is(true));
ExitSpan exitSpan = (ExitSpan) searchScrollSpan;
assertThat(exitSpan.getOperationName(), is("Elasticsearch/SearchScrollRequest"));
assertThat(exitSpan.getPeer(), is("127.0.0.1:9200"));
assertThat(SpanHelper.getComponentId(exitSpan), is(77));
List<TagValuePair> tags = SpanHelper.getTags(exitSpan);
assertThat(tags.size(), is(2));
assertThat(tags.get(0).getValue(), is("Elasticsearch"));
assertThat(tags.get(1).getValue(), is("searchScrollRequest"));
}
@Test
public void testMethodsAroundError() throws Throwable {
TRACE_DSL = true;
interceptor.beforeMethod(enhancedInstance, null, allArguments, null, null);
interceptor.handleMethodException(enhancedInstance, null, allArguments, null, new RuntimeException());
interceptor.afterMethod(enhancedInstance, null, allArguments, null, null);
List<TraceSegment> traceSegmentList = segmentStorage.getTraceSegments();
Assert.assertThat(traceSegmentList.size(), is(1));
TraceSegment traceSegment = traceSegmentList.get(0);
AbstractTracingSpan searchScrollSpan = SegmentHelper.getSpans(traceSegment).get(0);
assertSearchScrollSpan(searchScrollSpan);
Assert.assertEquals(true, SpanHelper.getErrorOccurred(searchScrollSpan));
SpanAssert.assertException(SpanHelper.getLogs(searchScrollSpan).get(0), RuntimeException.class);
}
}
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.skywalking.apm.plugin.elasticsearch.v6.interceptor;
import org.apache.skywalking.apm.agent.core.context.trace.AbstractTracingSpan;
import org.apache.skywalking.apm.agent.core.context.trace.ExitSpan;
import org.apache.skywalking.apm.agent.core.context.trace.TraceSegment;
import org.apache.skywalking.apm.agent.core.context.util.TagValuePair;
import org.apache.skywalking.apm.agent.core.plugin.interceptor.enhance.EnhancedInstance;
import org.apache.skywalking.apm.agent.test.helper.SegmentHelper;
import org.apache.skywalking.apm.agent.test.helper.SpanHelper;
import org.apache.skywalking.apm.agent.test.tools.AgentServiceRule;
import org.apache.skywalking.apm.agent.test.tools.SegmentStorage;
import org.apache.skywalking.apm.agent.test.tools.SegmentStoragePoint;
import org.apache.skywalking.apm.agent.test.tools.SpanAssert;
import org.apache.skywalking.apm.agent.test.tools.TracingSegmentRunner;
import org.apache.skywalking.apm.plugin.elasticsearch.v6.RestClientEnhanceInfo;
import org.elasticsearch.script.mustache.SearchTemplateRequest;
import org.junit.Assert;
import org.junit.Before;
import org.junit.Rule;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.powermock.modules.junit4.PowerMockRunner;
import org.powermock.modules.junit4.PowerMockRunnerDelegate;
import java.util.List;
import static org.apache.skywalking.apm.plugin.elasticsearch.v6.ElasticsearchPluginConfig.Plugin.Elasticsearch.TRACE_DSL;
import static org.hamcrest.CoreMatchers.is;
import static org.junit.Assert.assertThat;
import static org.powermock.api.mockito.PowerMockito.when;
@RunWith(PowerMockRunner.class)
@PowerMockRunnerDelegate(TracingSegmentRunner.class)
public class RestHighLevelClientSearchTemplateMethodsInterceptorTest {
@SegmentStoragePoint
private SegmentStorage segmentStorage;
@Rule
public AgentServiceRule serviceRule = new AgentServiceRule();
@Mock
private EnhancedInstance enhancedInstance;
@Mock
private SearchTemplateRequest searchTemplateRequest;
private Object[] allArguments;
@Mock
private RestClientEnhanceInfo restClientEnhanceInfo;
private RestHighLevelClientSearchTemplateMethodsInterceptor interceptor;
@Before
public void setUp() throws Exception {
when(restClientEnhanceInfo.getPeers()).thenReturn("127.0.0.1:9200");
allArguments = new Object[] {searchTemplateRequest};
when(searchTemplateRequest.getScript()).thenReturn("searchTemplateRequest");
when(enhancedInstance.getSkyWalkingDynamicField()).thenReturn(restClientEnhanceInfo);
interceptor = new RestHighLevelClientSearchTemplateMethodsInterceptor();
}
@Test
public void testMethodsAround() throws Throwable {
TRACE_DSL = true;
interceptor.beforeMethod(enhancedInstance, null, allArguments, null, null);
interceptor.afterMethod(enhancedInstance, null, allArguments, null, null);
List<TraceSegment> traceSegmentList = segmentStorage.getTraceSegments();
Assert.assertThat(traceSegmentList.size(), is(1));
TraceSegment traceSegment = traceSegmentList.get(0);
AbstractTracingSpan searchTemplateSpan = SegmentHelper.getSpans(traceSegment).get(0);
assertSearchTemplateSpan(searchTemplateSpan);
}
private void assertSearchTemplateSpan(AbstractTracingSpan searchTemplateSpan) {
assertThat(searchTemplateSpan instanceof ExitSpan, is(true));
ExitSpan exitSpan = (ExitSpan) searchTemplateSpan;
assertThat(exitSpan.getOperationName(), is("Elasticsearch/SearchTemplateRequest"));
assertThat(exitSpan.getPeer(), is("127.0.0.1:9200"));
assertThat(SpanHelper.getComponentId(exitSpan), is(77));
List<TagValuePair> tags = SpanHelper.getTags(exitSpan);
assertThat(tags.size(), is(2));
assertThat(tags.get(0).getValue(), is("Elasticsearch"));
assertThat(tags.get(1).getValue(), is("searchTemplateRequest"));
}
@Test
public void testMethodsAroundError() throws Throwable {
TRACE_DSL = true;
interceptor.beforeMethod(enhancedInstance, null, allArguments, null, null);
interceptor.handleMethodException(enhancedInstance, null, allArguments, null, new RuntimeException());
interceptor.afterMethod(enhancedInstance, null, allArguments, null, null);
List<TraceSegment> traceSegmentList = segmentStorage.getTraceSegments();
Assert.assertThat(traceSegmentList.size(), is(1));
TraceSegment traceSegment = traceSegmentList.get(0);
AbstractTracingSpan searchTemplateSpan = SegmentHelper.getSpans(traceSegment).get(0);
assertSearchTemplateSpan(searchTemplateSpan);
Assert.assertEquals(true, SpanHelper.getErrorOccurred(searchTemplateSpan));
SpanAssert.assertException(SpanHelper.getLogs(searchTemplateSpan).get(0), RuntimeException.class);
}
}
@@ -31,8 +31,10 @@ public class JDBCPreparedStatementIgnorableSetterInterceptor implements Instance
     public final void beforeMethod(EnhancedInstance objInst, Method method, Object[] allArguments,
                                    Class<?>[] argumentsTypes, MethodInterceptResult result) throws Throwable {
         final StatementEnhanceInfos statementEnhanceInfos = (StatementEnhanceInfos) objInst.getSkyWalkingDynamicField();
-        final int index = (Integer) allArguments[0];
-        statementEnhanceInfos.setParameter(index, Constants.SQL_PARAMETER_PLACEHOLDER);
+        if (statementEnhanceInfos != null) {
+            final int index = (Integer) allArguments[0];
+            statementEnhanceInfos.setParameter(index, Constants.SQL_PARAMETER_PLACEHOLDER);
+        }
     }

     @Override
...
@@ -30,8 +30,10 @@ public class JDBCPreparedStatementNullSetterInterceptor implements InstanceMetho
     public final void beforeMethod(EnhancedInstance objInst, Method method, Object[] allArguments,
                                    Class<?>[] argumentsTypes, MethodInterceptResult result) throws Throwable {
         final StatementEnhanceInfos statementEnhanceInfos = (StatementEnhanceInfos) objInst.getSkyWalkingDynamicField();
-        final int index = (Integer) allArguments[0];
-        statementEnhanceInfos.setParameter(index, "NULL");
+        if (statementEnhanceInfos != null) {
+            final int index = (Integer) allArguments[0];
+            statementEnhanceInfos.setParameter(index, "NULL");
+        }
     }

     @Override
...
@@ -30,9 +30,11 @@ public class JDBCPreparedStatementSetterInterceptor implements InstanceMethodsAr
     public final void beforeMethod(EnhancedInstance objInst, Method method, Object[] allArguments,
                                    Class<?>[] argumentsTypes, MethodInterceptResult result) throws Throwable {
         final StatementEnhanceInfos statementEnhanceInfos = (StatementEnhanceInfos) objInst.getSkyWalkingDynamicField();
-        final int index = (Integer) allArguments[0];
-        final Object parameter = allArguments[1];
-        statementEnhanceInfos.setParameter(index, parameter);
+        if (statementEnhanceInfos != null) {
+            final int index = (Integer) allArguments[0];
+            final Object parameter = allArguments[1];
+            statementEnhanceInfos.setParameter(index, parameter);
+        }
     }

     @Override
...
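The three hunks above apply the same defensive fix: the SkyWalking dynamic field can be null when a `PreparedStatement` was created outside an instrumented code path, so the parameter setters must tolerate a missing `StatementEnhanceInfos` instead of throwing a `NullPointerException`. A standalone sketch of the pattern (the `ParameterRecorder` class and the `"?"` placeholder are illustrative stand-ins, not the plugin's real API):

```java
import java.util.HashMap;
import java.util.Map;

// Minimal stand-in for StatementEnhanceInfos: records SQL parameter placeholders.
class ParameterRecorder {
    final Map<Integer, Object> parameters = new HashMap<>();

    void setParameter(int index, Object value) {
        parameters.put(index, value);
    }
}

public class NullGuardedSetter {
    // Mirrors beforeMethod: only record the parameter when the dynamic field is present.
    static void beforeSetter(ParameterRecorder recorder, Object[] allArguments) {
        if (recorder != null) {                    // the guard added by the patch
            final int index = (Integer) allArguments[0];
            recorder.setParameter(index, "?");     // placeholder stand-in
        }
    }

    public static void main(String[] args) {
        // Instrumented statement: the parameter is captured.
        ParameterRecorder recorder = new ParameterRecorder();
        beforeSetter(recorder, new Object[] {1, "value"});
        System.out.println(recorder.parameters.get(1)); // ?

        // Un-instrumented statement (null dynamic field): no NullPointerException.
        beforeSetter(null, new Object[] {1, "value"});
        System.out.println("no NPE");
    }
}
```

The guard is cheap and keeps the interceptor inert for statements the agent never enhanced, which is safer than assuming every `PreparedStatement` carries the dynamic field.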
@@ -39,7 +39,7 @@ import static org.apache.skywalking.apm.agent.core.plugin.match.MethodInheritanc
  * <code>ControllerConstructorInterceptor</code> set the controller base path to
  * dynamic field before execute constructor.
  *
- * <code>org.apache.skywalking.apm.plugin.spring.mvc.v4.RequestMappingMethodInterceptor</code> get the request path
+ * <code>org.apache.skywalking.apm.plugin.spring.mvc.commons.interceptor.RequestMappingMethodInterceptor</code> get the request path
  * from dynamic field first, if not found, <code>RequestMappingMethodInterceptor</code> generate request path that
  * combine the path value of current annotation on current method and the base path and set the new path to the dynamic
  * filed
...
@@ -38,7 +38,7 @@ import static org.apache.skywalking.apm.agent.core.plugin.match.MethodInheritanc
  * <code>ControllerConstructorInterceptor</code> set the controller base path to
  * dynamic field before execute constructor.
  *
- * <code>org.apache.skywalking.apm.plugin.spring.mvc.v4.RequestMappingMethodInterceptor</code> get the request path
+ * <code>org.apache.skywalking.apm.plugin.spring.mvc.commons.interceptor.RequestMappingMethodInterceptor</code> get the request path
  * from dynamic field first, if not found, <code>RequestMappingMethodInterceptor</code> generate request path that
  * combine the path value of current annotation on current method and the base path and set the new path to the dynamic
  * filed
...
@@ -100,6 +100,7 @@ public class TServiceClientInterceptor implements InstanceConstructorInterceptor
         while (true) {
             TFieldIdEnum field = base.fieldForId(++idx);
             if (field == null) {
+                idx--;
                 break;
             }
             buffer.append(field.getFieldName()).append(", ");
...
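The added `idx--` compensates for the final `++idx` that overshoots when `fieldForId` returns null, leaving `idx` at the last valid field id after the loop instead of one past it. The off-by-one can be reproduced with a plain array lookup standing in for `base.fieldForId` (all names here are illustrative):

```java
public class FieldScan {
    // Stand-in for base.fieldForId(id): valid ids are 1..names.length, else null.
    static String fieldForId(String[] names, int id) {
        return (id >= 1 && id <= names.length) ? names[id - 1] : null;
    }

    // Mirrors the patched loop: walk the fields, keep idx at the last valid id.
    static int lastFieldId(String[] names) {
        int idx = 0;
        StringBuilder buffer = new StringBuilder();
        while (true) {
            String field = fieldForId(names, ++idx);
            if (field == null) {
                idx--;            // the fix: undo the increment that overshot
                break;
            }
            buffer.append(field).append(", ");
        }
        return idx;
    }

    public static void main(String[] args) {
        System.out.println(lastFieldId(new String[] {"a", "b", "c"})); // 3
        System.out.println(lastFieldId(new String[] {}));              // 0
    }
}
```

Without the decrement the loop would report one more field than the struct actually has, which matters when `idx` is reused after the loop.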
@@ -21,6 +21,7 @@ package org.apache.skywalking.apm.plugin.thrift.wrapper;
 import java.util.HashMap;
 import java.util.Map;
 import java.util.Objects;
+
 import org.apache.skywalking.apm.agent.core.context.CarrierItem;
 import org.apache.skywalking.apm.agent.core.context.ContextCarrier;
 import org.apache.skywalking.apm.agent.core.context.ContextManager;
...
@@ -45,6 +46,7 @@ public class ServerInProtocolWrapper extends AbstractProtocolWrapper {
     private static final ILog LOGGER = LogManager.getLogger(ServerInProtocolWrapper.class);
     private static final StringTag TAG_ARGS = new StringTag("args");
     private AbstractContext context;
+    private static final String HAVE_CREATED_SPAN = "HAVE_CREATED_SPAN";

     public ServerInProtocolWrapper(final TProtocol protocol) {
         super(protocol);
...
@@ -52,6 +54,7 @@ public class ServerInProtocolWrapper extends AbstractProtocolWrapper {
     public void initial(AbstractContext context) {
         this.context = context;
+        ContextManager.getRuntimeContext().put(HAVE_CREATED_SPAN, false);
     }

     @Override
...
@@ -72,6 +75,7 @@ public class ServerInProtocolWrapper extends AbstractProtocolWrapper {
                     span.tag(TAG_ARGS, context.getArguments());
                     span.setComponent(ComponentsDefine.THRIFT_SERVER);
                     SpanLayer.asRPCFramework(span);
+                    ContextManager.getRuntimeContext().put(HAVE_CREATED_SPAN, true);
                 } catch (Throwable throwable) {
                     LOGGER.error("Failed to resolve header or create EntrySpan.", throwable);
                 } finally {
...
@@ -81,6 +85,24 @@ public class ServerInProtocolWrapper extends AbstractProtocolWrapper {
             }
             return readFieldBegin();
         }
+        if (field.type == TType.STOP) {
+            Boolean haveCreatedSpan =
+                    (Boolean) ContextManager.getRuntimeContext().get(HAVE_CREATED_SPAN);
+            if (haveCreatedSpan != null && !haveCreatedSpan) {
+                try {
+                    AbstractSpan span = ContextManager.createEntrySpan(
+                            context.getOperatorName(), createContextCarrier(null));
+                    span.start(context.startTime);
+                    span.tag(TAG_ARGS, context.getArguments());
+                    span.setComponent(ComponentsDefine.THRIFT_SERVER);
+                    SpanLayer.asRPCFramework(span);
+                } catch (Throwable throwable) {
+                    LOGGER.error("Failed to create EntrySpan.", throwable);
+                } finally {
+                    context = null;
+                }
+            }
+        }
         return field;
     }
...
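The thrift change above tracks, via a `HAVE_CREATED_SPAN` flag in the runtime context, whether an entry span was already created while reading the message fields; if the protocol reaches the `STOP` field without one (e.g. the call carried no SkyWalking header field), a fallback entry span is created so the request is still traced exactly once. The control flow reduces to a flag check, sketched here with a plain `Map` standing in for the agent's `RuntimeContext` and a counter standing in for span creation:

```java
import java.util.HashMap;
import java.util.Map;

public class FallbackSpanFlow {
    static final String HAVE_CREATED_SPAN = "HAVE_CREATED_SPAN";
    static final Map<String, Object> runtimeContext = new HashMap<>();
    static int spansCreated = 0;

    // Mirrors initial(): reset the flag at the start of each request.
    static void initial() {
        runtimeContext.put(HAVE_CREATED_SPAN, false);
    }

    // Called when the trace-context header field is found mid-stream.
    static void onHeaderField() {
        spansCreated++;                              // EntrySpan from the carrier
        runtimeContext.put(HAVE_CREATED_SPAN, true);
    }

    // Called at TType.STOP: create a span only if none was created earlier.
    static void onStopField() {
        Boolean haveCreatedSpan = (Boolean) runtimeContext.get(HAVE_CREATED_SPAN);
        if (haveCreatedSpan != null && !haveCreatedSpan) {
            spansCreated++;                          // fallback EntrySpan, empty carrier
        }
    }

    public static void main(String[] args) {
        initial();
        onStopField();                 // request without a header field
        System.out.println(spansCreated); // 1 -> fallback span was created

        initial();
        onHeaderField();
        onStopField();                 // header seen earlier: no duplicate span
        System.out.println(spansCreated); // 2
    }
}
```

The null check on the boxed `Boolean` matters: if `initial` was never called for this thread, the flag is absent and the fallback path is skipped rather than creating a span with no context.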
@@ -51,6 +51,10 @@ agent.service_name=${SW_AGENT_NAME:Your_ApplicationName}
 # Notice, in the current practice, we don't recommend the length over 190.
 # agent.operation_name_threshold=${SW_AGENT_OPERATION_NAME_THRESHOLD:150}

+# The agent use gRPC plain text in default.
+# If true, SkyWalking agent uses TLS even no CA file detected.
+# agent.force_tls=${SW_AGENT_FORCE_TLS:false}
+
 # If true, skywalking agent will enable profile when user create a new profile task. Otherwise disable profile.
 # profile.active=${SW_AGENT_PROFILE_ACTIVE:true}
...
@@ -59,8 +59,7 @@ public class KafkaTraceSegmentServiceClient implements BootService, IConsumer<Tr
     @Override
     public void boot() {
-        carrier = new DataCarrier<>(CHANNEL_SIZE, BUFFER_SIZE);
-        carrier.setBufferStrategy(BufferStrategy.IF_POSSIBLE);
+        carrier = new DataCarrier<>(CHANNEL_SIZE, BUFFER_SIZE, BufferStrategy.IF_POSSIBLE);
         carrier.consume(this, 1);
         producer = ServiceManager.INSTANCE.findService(KafkaProducerManager.class).getProducer();
...
@@ -225,8 +225,8 @@ Apache 2.0 licenses
 The following components are provided under the Apache License. See project link for details.
 The text of each license is the standard Apache 2.0 license.

-    raphw (byte-buddy) 1.10.14: http://bytebuddy.net/ , Apache 2.0
-    Google: gprc-java 1.32.1: https://github.com/grpc/grpc-java, Apache 2.0
+    raphw (byte-buddy) 1.10.19: http://bytebuddy.net/ , Apache 2.0
+    Google: grpc-java 1.32.1: https://github.com/grpc/grpc-java, Apache 2.0
     Google: guava 28.1: https://github.com/google/guava , Apache 2.0
     Google: guice 4.1.0: https://github.com/google/guice , Apache 2.0
     Google: gson 2.8.6: https://github.com/google/gson , Apache 2.0
...
@@ -381,7 +381,7 @@ BSD licenses
 The following components are provided under a BSD license. See project link for details.
 The text of each license is also included at licenses/LICENSE-[project].txt.

-    asm 8.0.1:https://gitlab.ow2.org , BSD-3-Clause
+    asm 9.0:https://gitlab.ow2.org , BSD-3-Clause
     antlr4-runtime 4.5.1: http://www.antlr.org/license.html, BSD-3-Clause
     jline 0.9.94: http://mvnrepository.com/artifact/jline/jline/0.9.94, BSD
     Google: protobuf-java 3.13.0: https://github.com/google/protobuf/blob/master/java/pom.xml , BSD-3-Clause
...
@@ -14,7 +14,7 @@ In **agent-analyzer** module, you will find `sampleRate` setting.
 agent-analyzer:
   default:
     ...
-    sampleRate: ${SW_TRACE_SAMPLE_RATE:1000} # The sample rate precision is 1/10000. 10000 means 100% sample in default.
+    sampleRate: ${SW_TRACE_SAMPLE_RATE:10000} # The sample rate precision is 1/10000. 10000 means 100% sample in default.
     forceSampleErrorSegment: ${SW_FORCE_SAMPLE_ERROR_SEGMENT:true} # When sampling mechanism activated, this config would make the error status segment sampled, ignoring the sampling rate.
     slowTraceSegmentThreshold: ${SW_SLOW_TRACE_SEGMENT_THRESHOLD:-1} # Setting this threshold about the latency would make the slow trace segments sampled if they cost more time, even the sampling mechanism activated. The default value is `-1`, which means would not sample slow traces. Unit, millisecond.
 ```
...
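The doc fix above corrects the documented default from `1000` to `10000`: since the precision is 1/10000, only a rate of 10000 actually means "sample everything", and the old value silently described a 10% sample. A rate sampler at that precision can be sketched as a rolling counter compared against the rate (this is a sketch of the general technique, not the OAP's actual implementation):

```java
import java.util.concurrent.atomic.AtomicInteger;

public class RateSampler {
    static final int PRECISION = 10000;       // sampleRate is expressed out of 10000
    final int sampleRate;
    final AtomicInteger counter = new AtomicInteger();

    RateSampler(int sampleRate) {
        this.sampleRate = sampleRate;
    }

    // Keep the segment when the rolling counter falls under the configured rate.
    boolean shouldSample() {
        return counter.getAndIncrement() % PRECISION < sampleRate;
    }

    public static void main(String[] args) {
        RateSampler full = new RateSampler(10000);    // corrected default: 100%
        RateSampler tenth = new RateSampler(1000);    // the old doc value: only 10%

        int fullKept = 0, tenthKept = 0;
        for (int i = 0; i < 10000; i++) {
            if (full.shouldSample()) fullKept++;
            if (tenth.shouldSample()) tenthKept++;
        }
        System.out.println(fullKept);  // 10000
        System.out.println(tenthKept); // 1000
    }
}
```

Over each window of 10000 segments, exactly `sampleRate` of them are kept, which is why the documented default must equal the precision to mean "no sampling".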
@@ -86,6 +86,7 @@ property key | Description | Default |
 `agent.force_reconnection_period `|Force reconnection period of grpc, based on grpc_channel_check_interval.|`1`|
 `agent.operation_name_threshold `|The operationName max length, setting this value > 190 is not recommended.|`150`|
 `agent.keep_tracing`|Keep tracing even the backend is not available if this value is `true`.|`false`|
+`agent.force_tls`|Force open TLS for gRPC channel if this value is `true`.|`false`|
 `osinfo.ipv4_list_size`| Limit the length of the ipv4 list size. |`10`|
 `collector.grpc_channel_check_interval`|grpc channel status check interval.|`30`|
 `collector.heartbeat_period`|agent heartbeat report period. Unit, second.|`30`|
...
@@ -19,6 +19,8 @@ Only support **no mutual auth**.

 ### Agent config
 - Place `ca.crt` into `/ca` folder in agent package. Notice, `/ca` is not created in distribution, please create it by yourself.
-Agent open TLS automatically after the `/ca/ca.crt` file detected.
-o make sure can't access other ports out of region (VPC), such as firewall, proxy.
\ No newline at end of file
+- Agent open TLS automatically after the `/ca/ca.crt` file detected.
+- TLS with no CA mode could be activated by this setting.
+```
+agent.force_tls=${SW_AGENT_FORCE_TLS:false}
+```
@@ -91,7 +91,7 @@ Topology map shows the relationship among the services and instances with metric
 * Topology shows the default global topology including all services.
 * **Service Selector** provides 2 level selectors, service group list and service name list. The group name is separated from
-the service name if it follows <group name>::<logic name> format. Topology map is available for single group, single service,
+the service name if it follows `<group name>::<logic name>` format. Topology map is available for single group, single service,
 or global(include all services).
 * **Custom Group** provides the any sub topology capability of service group.
 * **Service Deep Dive** opens when you click any service. The honeycomb could do metrics, trace and alarm query of the selected service.
...
# Powered by Apache SkyWalking
This page documents an **alphabetical list** of institutions that are using Apache SkyWalking for research and production,
or providing commercial products including Apache SkyWalking.
1. 100tal.cn 北京世纪好未来教育科技有限公司 http://www.100tal.com/
1. 17173.com https://www.17173.com/
1. 300.cn 中企动力科技股份有限公司 http://www.300.cn/
1. 360jinrong.net 360金融 https://www.360jinrong.net/
1. 4399.com 四三九九网络股份有限公司. http://www.4399.com/
1. 51mydao.com 买道传感科技(上海)有限公司 https://www.51mydao.com/
1. 58 Daojia Inc. 58到家 https://www.daojia.com
1. 5i5j. 上海我爱我家房地产经纪有限公司 https://sh.5i5j.com/about/
1. Anheuser-Busch InBev 百威英博
1. Agricultural Bank of China 中国农业银行
1. Aihuishou.com 爱回收网 http://www.aihuishou.com/
1. Alibaba Cloud, 阿里云, http://aliyun.com
1. Anxin Insurance. 安心财产保险有限责任公司 https://www.95303.com
1. APM Star 北京天空漫步科技有限公司 http://www.apmstar.com
1. AsiaInfo Inc. http://www.asiainfo.com.cn/
1. Autohome. 汽车之家. http://www.autohome.com.cn
1. baidu 百度 https://www.baidu.com/
1. Baixing.com 百姓网 http://www.baixing.com/
1. bitauto 易车 http://bitauto.com
1. hellobanma 斑马网络 https://www.hellobanma.com/
1. bestsign. 上上签. https://www.bestsign.cn/page/
1. Beike Finance 贝壳金服 https://www.bkjk.com/
1. Bizsaas.cn 北京商云科技发展有限公司. http://www.bizsaas.cn/
1. BoCloud 苏州博纳讯动软件有限公司. http://www.bocloud.com.cn/
1. Cdlhyj.com 六合远教(成都)科技有限公司 http://www.cdlhyj.com
1. Chehejia Automotive. 北京车和家信息技术有限责任公司. https://www.chehejia.com/
1. China Eastern Airlines 中国东方航空 http://www.ceair.com/
1. China Express Airlines 华夏航空 http://www.chinaexpressair.com/
1. Chinadaas. 北京中数智汇科技股份有限公司. https://www.chinadaas.com/
1. Chinasoft International 中软国际
1. China Merchants Bank. 中国招商银行. http://www.cmbchina.com/
1. China National Software 中软
1. China Mobile 中国移动
1. China Unicom 中国联通
1. China Tower 中国铁塔
1. China Telecom 中国电信
1. Chinese Academy of Sciences
1. Chtwm.com. 恒天财富投资管理股份有限公司. https://www.chtwm.com/
1. Cmft.com. 招商局金融科技. https://www.cmft.com/
1. CXIST.com 上海程析智能科技有限公司 https://www.cxist.com/
1. Dangdang.com. 当当网. http://www.dangdang.com/
1. DaoCloud. https://www.daocloud.io/
1. deepblueai.com 深兰科技上海有限公司 https://www.deepblueai.com/
1. Deppon Logistics Co Ltd 德邦物流 https://www.deppon.com/
1. Deyoushenghuo in WeChat app. 河南有态度信息科技有限公司,微信小程序:得有生活
1. Dianfubao.com 垫富宝 https://www.dianfubao.com/
1. DiDi 滴滴出行
1. dxy.cn 丁香园 http://www.dxy.cn/
1. Byte Dance 字节跳动 https://bytedance.com
1. Echplus.com 北京易诚互动网络技术有限公司 http://www.echplus.com/
1. Enmonster 怪兽充电 http://www.enmonster.com/
1. Eqxiu.com. 北京中网易企秀科技有限公司 http://www.eqxiu.com/
1. essence.com.cn 安信证券股份有限公司 http://www.essence.com.cn/
1. fangdd.com 房多多 https://www.fangdd.com
1. fullgoal.com.cn 富国基金管理有限公司 https://www.fullgoal.com.cn/
1. GTrace System. (No company provided)
1. GSX Techedu Inc. 跟谁学 https://www.genshuixue.com
1. Gdeng.cn 深圳谷登科技有限公司 http://www.gdeng.cn/
1. GOME 国美 https://www.gome.com.cn/
1. Guazi.com 瓜子二手车直卖网. https://www.guazi.com/
1. guohuaitech.com 北京国槐信息科技有限公司. http://www.guohuaitech.com/
1. GrowingIO 北京易数科技有限公司 https://www.growingio.com/
1. Haier. 海尔消费金融 https://www.haiercash.com/
1. Haoyunhu. 上海好运虎供应链管理有限公司 http://www.haoyunhu56.com/
1. helijia.com 河狸家 http://www.helijia.com/
1. Huawei
1. Hundun YUNRONG Fintech. 杭州恒生云融网络科技有限公司 https://www.hsjry.com/
1. hunliji.com 婚礼纪 https://www.hunliji.com/
1. hydee.cn 海典软件 http://www.hydee.cn/
1. iBoxChain 盒子科技 https://www.iboxpay.com/
1. iFLYTEK. 科大讯飞股份有限公司-消费者BG http://www.iflytek.com/
1. Inspur 浪潮集团
1. iQIYI.COM. 爱奇艺 https://www.iqiyi.com/
1. juhaokan 聚好看科技股份有限公司 https://www.juhaokan.org/
1. Ke.com. 贝壳找房. https://www.ke.com
1. Keking.cn 凯京集团. http://www.keking.cn
1. KubeSphere https://kubesphere.io
1. JoinTown. 九州通医药集团 http://www.jztey.com/
1. Lagou.com. 拉勾. https://www.lagou.com/
1. laocaibao. 上海证大爱特金融信息服务有限公司 https://www.laocaibao.com/
1. Lenovo 联想
1. liaofan168.com 了凡科技 http://www.liaofan168.com
1. lianzhongyouche.com.cn 联众优车 https://www.lianzhongyouche.com.cn/
1. Lima 北京力码科技有限公司 https://www.zhongbaounion.com/
1. Lifesense.com 广东乐心医疗电子股份有限公司 http://www.lifesense.com/
1. lizhi.fm 荔枝FM https://www.lizhi.fm/
1. Lixiang.com 理想汽车 https://www.lixiang.com/
1. Madecare. 北京美德远健科技有限公司. http://www.madecare.com/
1. Maodou.com 毛豆新车网. https://www.maodou.com/
1. Mobanker.com 上海前隆信息科技有限公司 http://www.mobanker.com/
1. Mxnavi. 沈阳美行科技有限公司 http://www.mxnavi.com/
1. Moji 墨叽(深圳)科技有限公司 https://www.mojivip.com
1. Minsheng FinTech / China Minsheng Bank 民生科技有限责任公司 http://www.mskj.com/
1. Migu Digital Media Co.Ltd. 咪咕数字传媒有限公司 http://www.migu.cn/
1. Mypharma.com 北京融贯电子商务有限公司 https://www.mypharma.com
1. NetEase 网易 https://www.163.com/
1. Osacart in WeChat app 广州美克曼尼电子商务有限公司
1. Oriente. https://oriente.com/
1. Peking University 北京大学
1. Ping An Technology / Ping An Insurance 平安科技
1. Primeton.com 普元信息技术股份有限公司 http://www.primeton.com
1. qiniu.com 七牛云 http://qiniu.com
1. Qingyidai.com 轻易贷 https://www.qingyidai.com/
1. Qsdjf.com 浙江钱宝网络科技有限公司 https://www.qsdjf.com/index.html
1. Qk365.com 上海青客电子商务有限公司 https://www.qk365.com
1. Qudian 趣店 http://ir.qudian.com/
1. Renren Network 人人网
1. Rong Data. 荣数数据 http://www.rong-data.com/
1. Rongjinbao. 深圳融金宝互联网金融服务有限公司. http://www.rjb777.com
1. Safedog. 安全狗. http://www.safedog.cn/
1. servingcloud.com 盈佳云创科技(深圳)有限公司 http://www.servingcloud.com/
1. SF Express 顺丰速运 https://www.sf-express.com/
1. Shouqi Limousine & Chauffeur Group 首约科技(北京)有限公司. https://www.01zhuanche.com/
1. shuaibaoshop.com 宁波鲸灵网络科技有限公司 http://www.shuaibaoshop.com/
1. shuyun.com 杭州数云信息技术有限公司 http://www.shuyun.com/
1. Sijibao.com 司机宝 https://www.sijibao.com/
1. Sina 新浪
1. Sinolink Securities Co.,Ltd. 国金证券佣金宝 http://www.yongjinbao.com.cn/
1. Source++ https://sourceplusplus.com
1. SPD Bank 浦发银行
1. StartDT 奇点云 https://www.startdt.com/
1. State Grid Corporation of China 国家电网有限公司
1. Successchannel 苏州渠成易销网络科技有限公司. http://www.successchannel.com
1. SuperMap 北京超图软件
1. syswin.com 北京思源集团 http://www.syswin.com/
1. szhittech.com 深圳和而泰智能控制股份有限公司. http://www.szhittech.com/
1. Tencent
1. Tetrate.io https://www.tetrate.io/
1. Thomas Cook 托迈酷客 https://www.thomascook.com.cn
1. Three Squirrels 三只松鼠
1. Today36524.com Today便利店
1. Tongcheng. 同城金服. https://jr.ly.com/
1. Tools information technology co. LTD 杭州图尔兹信息技术有限公司 http://bintools.cn/
1. TravelSky 中国航信 http://www.travelsky.net/
1. Tsfinance.com 重庆宜迅联供应链科技有限公司 https://www.tsfinance.com.cn/
1. tuhu.cn 途虎养车 https://www.tuhu.cn
1. Tuya. 涂鸦智能. https://www.tuya.com
1. Tydic 天源迪科 https://www.tydic.com/
1. VBill Payment Co., LTD. 随行付. https://www.vbill.cn/
1. Wahaha Group 娃哈哈 http://www.wahaha.com.cn/
1. WeBank. 微众银行 http://www.webank.com
1. Weier. 广州文尔软件科技有限公司. https://www.site0.cn
1. Wochu. 我厨买菜. https://www.wochu.cn
1. Xiaomi. 小米. https://www.mi.com/en/
1. xin.com 优信集团 http://www.xin.com
1. Xinyebang.com 重庆欣业邦网络技术有限公司 http://www.xinyebang.com
1. xueqiu.com 雪球财经. https://xueqiu.com/
1. yibainetwork.com 深圳易佰网络有限公司 http://www.yibainetwork.com/
1. Yiguo. 易果生鲜. http://www.yiguo.com/
1. Yinji (Shenzhen) Network Technology Co., Ltd. 印记. http://www.yinjiyun.cn/
1. Yonghui Superstores Co., Ltd. 永辉超市 http://www.yonghui.com.cn
1. Yonyou 用友
1. Youzan.com 杭州有赞科技有限公司 http://www.youzan.com/
1. Yunda Express 韵达快运 http://www.yunda56.com/
1. Yunnan Airport Group Co.,Ltd. 云南机场集团
1. yxt 云学堂 http://www.yxt.com/
1. zbj.com 猪八戒 https://www.zbj.com/
1. zhaopin.com 智联招聘 https://www.zhaopin.com/
1. zjs.com.cn 北京宅急送快运股份有限公司 http://www.zjs.com.cn/
# Use Cases
## Alibaba and Alibaba Cloud
Alibaba products, including the [Cloud DevOps product](https://cn.aliyun.com/product/yunxiao), are monitored by SkyWalking.
Alibaba Cloud supports SkyWalking agents and formats in its Tracing Analysis cloud service.
## China Eastern Airlines
SkyWalking is integrated into the microservices architecture support platform.
## China Merchants Bank
Uses SkyWalking and the [SkyAPM .NET agent](https://github.com/SkyAPM/SkyAPM-dotnet) in the CMBChina Mall project.
## China Mobile
China Mobile Suzhou Research Center (CMSS) integrated SkyWalking as the APM component of the China Mobile PaaS.
## ke.com
SkyWalking is deployed in production environments:
- Three CentOS machines (32 CPUs, 64 GB RAM, 1.3 TB disk) for the collector servers
- Three Elasticsearch nodes (version 6.4.2, 40 CPUs, 96 GB RAM, 7 TB disk) for storage

This supports 60+ instances, over 300k calls per minute, and over 50k spans per second.
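A quick back-of-envelope check of the throughput figures above (our own arithmetic, not part of the original report): 300k calls per minute is 5k calls per second, so 50k spans per second implies an average of roughly 10 spans per traced call.

```java
public class KeDeploymentMath {
    public static void main(String[] args) {
        double callsPerMinute = 300_000;   // "Over 300k Calls Per Minute"
        double spansPerSecond = 50_000;    // "Over 50k Spans Per Second"

        // Convert to a common time base before dividing.
        double callsPerSecond = callsPerMinute / 60;              // 5000.0
        double spansPerCall = spansPerSecond / callsPerSecond;    // implied average spans per call

        System.out.println(spansPerCall); // prints 10.0
    }
}
```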
## guazi.com
Guazi.com uses SkyWalking to monitor 270+ services,
including topology and metrics analysis, collecting 1.1+ billion traces per day at 100% sampling.
The plan is to reach 1k+ services and 5 billion traces per day.
## Oscart
Uses multiple language agents from SkyWalking and its ecosystem, including the SkyWalking Java agent and the [SkyAPM Node.js agent](https://github.com/SkyAPM/SkyAPM-nodejs). The SkyWalking OAP platform acts as the backend and visualization.
## Primeton
SkyWalking is integrated into Primeton EOS PLATFORM 8, a commercial microservice platform.
## Qiniu Cloud
Qiniu Cloud provides a customized SkyWalking agent, which supports distributed tracing and is integrated into its intelligent log management platform.
## Source++
An open-source observant programming assistant that aims to bridge APM tools with the developer's IDE to enable tighter feedback loops. Source++ uses SkyWalking as the de facto APM for JVM-based applications.
## Tetrate
Tetrate provides an enterprise-grade service mesh. SkyWalking acts as the core observability platform for hybrid
enterprise service mesh environments.
## lagou.com
Lagou.com uses SkyWalking for JVM-based applications, deployed in production. It has customized and optimized multiple collector functions, such as alarms, SQL metrics, circle operation metrics, thread monitoring, and detail mode, and supports 200+ instances at over 4500k segments per minute.
## Yonghui Superstores
Yonghui Superstores Co., Ltd. uses SkyWalking as its primary APM system to monitor clusters of 1k+ instances carrying a 150k+ TPS/QPS payload. SkyWalking collects, analyzes, and saves 10 billion trace segments (about 3 TB of disk) each day under a 100% sampling strategy. The SkyWalking backend cluster is built with 15 OAP nodes and 20 Elasticsearch nodes.
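As a rough sanity check on the storage figures above (again, our own arithmetic rather than a number from the source): 10 billion segments per day occupying about 3 TB works out to roughly 300 bytes of disk per stored segment after indexing and compression.

```java
public class YonghuiStorageMath {
    public static void main(String[] args) {
        double segmentsPerDay = 10_000_000_000.0; // 10 billion trace segments per day
        double diskBytesPerDay = 3e12;            // ~3 TB (decimal) of disk per day

        // Average on-disk footprint per segment.
        double bytesPerSegment = diskBytesPerDay / segmentsPerDay;

        System.out.println(bytesPerSegment); // prints 300.0
    }
}
```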
...@@ -154,7 +154,7 @@ storage: ...@@ -154,7 +154,7 @@ storage:
advanced: ${SW_STORAGE_ES_ADVANCED:""} advanced: ${SW_STORAGE_ES_ADVANCED:""}
h2: h2:
driver: ${SW_STORAGE_H2_DRIVER:org.h2.jdbcx.JdbcDataSource} driver: ${SW_STORAGE_H2_DRIVER:org.h2.jdbcx.JdbcDataSource}
url: ${SW_STORAGE_H2_URL:jdbc:h2:mem:skywalking-oap-db} url: ${SW_STORAGE_H2_URL:jdbc:h2:mem:skywalking-oap-db;DB_CLOSE_DELAY=-1}
user: ${SW_STORAGE_H2_USER:sa} user: ${SW_STORAGE_H2_USER:sa}
metadataQueryMaxSize: ${SW_STORAGE_H2_QUERY_MAX_SIZE:5000} metadataQueryMaxSize: ${SW_STORAGE_H2_QUERY_MAX_SIZE:5000}
maxSizeOfArrayColumn: ${SW_STORAGE_MAX_SIZE_OF_ARRAY_COLUMN:20} maxSizeOfArrayColumn: ${SW_STORAGE_MAX_SIZE_OF_ARRAY_COLUMN:20}
......
...@@ -404,7 +404,7 @@ SmartSql: ...@@ -404,7 +404,7 @@ SmartSql:
HttpServer: HttpServer:
id: 4001 id: 4001
languages: Node.js languages: Node.js
express: Express:
id: 4002 id: 4002
languages: Node.js languages: Node.js
Egg: Egg:
...@@ -413,6 +413,9 @@ Egg: ...@@ -413,6 +413,9 @@ Egg:
Koa: Koa:
id: 4004 id: 4004
languages: Node.js languages: Node.js
Axios:
id: 4005
languages: Node.js
# Golang components # Golang components
# [5000, 6000) for Golang agent # [5000, 6000) for Golang agent
...@@ -476,6 +479,12 @@ Urllib3: ...@@ -476,6 +479,12 @@ Urllib3:
Sanic: Sanic:
id: 7007 id: 7007
languages: Python languages: Python
AioHttp:
id: 7008
languages: Python
Pyramid:
id: 7009
languages: Python
# PHP components # PHP components
# [8000, 9000) for PHP agent # [8000, 9000) for PHP agent
...@@ -504,6 +513,15 @@ EnvoyProxy: ...@@ -504,6 +513,15 @@ EnvoyProxy:
id: 9000 id: 9000
languages: C++ languages: C++
# Javascript components
# [10000, 11000) for Javascript agent
JavaScript:
id: 10000
languages: JavaScript
ajax:
id: 10001
languages: JavaScript
# Component Server mapping defines the server display names of some components # Component Server mapping defines the server display names of some components
# e.g. # e.g.
# Jedis is a client library in Java for Redis server # Jedis is a client library in Java for Redis server
......
...@@ -28,6 +28,7 @@ import org.apache.skywalking.oap.server.library.module.ModuleProvider; ...@@ -28,6 +28,7 @@ import org.apache.skywalking.oap.server.library.module.ModuleProvider;
import org.apache.skywalking.oap.server.library.module.ModuleStartException; import org.apache.skywalking.oap.server.library.module.ModuleStartException;
import org.apache.skywalking.oap.server.library.module.ServiceNotProvidedException; import org.apache.skywalking.oap.server.library.module.ServiceNotProvidedException;
import org.apache.skywalking.oap.server.receiver.browser.module.BrowserModule; import org.apache.skywalking.oap.server.receiver.browser.module.BrowserModule;
import org.apache.skywalking.oap.server.receiver.browser.provider.handler.grpc.BrowserPerfServiceHandlerCompat;
import org.apache.skywalking.oap.server.receiver.browser.provider.handler.grpc.BrowserPerfServiceHandler; import org.apache.skywalking.oap.server.receiver.browser.provider.handler.grpc.BrowserPerfServiceHandler;
import org.apache.skywalking.oap.server.receiver.browser.provider.handler.rest.BrowserErrorLogReportListServletHandler; import org.apache.skywalking.oap.server.receiver.browser.provider.handler.rest.BrowserErrorLogReportListServletHandler;
import org.apache.skywalking.oap.server.receiver.browser.provider.handler.rest.BrowserErrorLogReportSingleServletHandler; import org.apache.skywalking.oap.server.receiver.browser.provider.handler.rest.BrowserErrorLogReportSingleServletHandler;
...@@ -74,9 +75,10 @@ public class BrowserModuleProvider extends ModuleProvider { ...@@ -74,9 +75,10 @@ public class BrowserModuleProvider extends ModuleProvider {
GRPCHandlerRegister grpcHandlerRegister = getManager().find(SharingServerModule.NAME) GRPCHandlerRegister grpcHandlerRegister = getManager().find(SharingServerModule.NAME)
.provider().getService(GRPCHandlerRegister.class); .provider().getService(GRPCHandlerRegister.class);
// grpc // grpc
grpcHandlerRegister.addHandler( BrowserPerfServiceHandler browserPerfServiceHandler = new BrowserPerfServiceHandler(
new BrowserPerfServiceHandler( getManager(), moduleConfig, perfDataListenerManager(), errorLogListenerManager());
getManager(), moduleConfig, perfDataListenerManager(), errorLogListenerManager())); grpcHandlerRegister.addHandler(browserPerfServiceHandler);
grpcHandlerRegister.addHandler(new BrowserPerfServiceHandlerCompat(browserPerfServiceHandler));
// rest // rest
JettyHandlerRegister jettyHandlerRegister = getManager().find(SharingServerModule.NAME) JettyHandlerRegister jettyHandlerRegister = getManager().find(SharingServerModule.NAME)
......
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.skywalking.oap.server.receiver.browser.provider.handler.grpc;
import io.grpc.stub.StreamObserver;
import lombok.RequiredArgsConstructor;
import org.apache.skywalking.apm.network.common.v3.Commands;
import org.apache.skywalking.apm.network.language.agent.v3.BrowserErrorLog;
import org.apache.skywalking.apm.network.language.agent.v3.BrowserPerfData;
import org.apache.skywalking.apm.network.language.agent.v3.compat.BrowserPerfServiceGrpc;
import org.apache.skywalking.oap.server.library.server.grpc.GRPCHandler;
@RequiredArgsConstructor
public class BrowserPerfServiceHandlerCompat extends BrowserPerfServiceGrpc.BrowserPerfServiceImplBase implements GRPCHandler {
private final BrowserPerfServiceHandler delegate;
@Override
public void collectPerfData(final BrowserPerfData request, final StreamObserver<Commands> responseObserver) {
delegate.collectPerfData(request, responseObserver);
}
@Override
public StreamObserver<BrowserErrorLog> collectErrorLogs(final StreamObserver<Commands> responseObserver) {
return delegate.collectErrorLogs(responseObserver);
}
}
...@@ -27,6 +27,7 @@ import org.apache.skywalking.oap.server.library.module.ModuleProvider; ...@@ -27,6 +27,7 @@ import org.apache.skywalking.oap.server.library.module.ModuleProvider;
import org.apache.skywalking.oap.server.library.module.ModuleStartException; import org.apache.skywalking.oap.server.library.module.ModuleStartException;
import org.apache.skywalking.oap.server.library.module.ServiceNotProvidedException; import org.apache.skywalking.oap.server.library.module.ServiceNotProvidedException;
import org.apache.skywalking.oap.server.receiver.clr.module.CLRModule; import org.apache.skywalking.oap.server.receiver.clr.module.CLRModule;
import org.apache.skywalking.oap.server.receiver.clr.provider.handler.CLRMetricReportServiceHandlerCompat;
import org.apache.skywalking.oap.server.receiver.clr.provider.handler.CLRMetricReportServiceHandler; import org.apache.skywalking.oap.server.receiver.clr.provider.handler.CLRMetricReportServiceHandler;
import org.apache.skywalking.oap.server.receiver.sharing.server.SharingServerModule; import org.apache.skywalking.oap.server.receiver.sharing.server.SharingServerModule;
...@@ -66,7 +67,9 @@ public class CLRModuleProvider extends ModuleProvider { ...@@ -66,7 +67,9 @@ public class CLRModuleProvider extends ModuleProvider {
GRPCHandlerRegister grpcHandlerRegister = getManager().find(SharingServerModule.NAME) GRPCHandlerRegister grpcHandlerRegister = getManager().find(SharingServerModule.NAME)
.provider() .provider()
.getService(GRPCHandlerRegister.class); .getService(GRPCHandlerRegister.class);
grpcHandlerRegister.addHandler(new CLRMetricReportServiceHandler(getManager())); CLRMetricReportServiceHandler clrMetricReportServiceHandler = new CLRMetricReportServiceHandler(getManager());
grpcHandlerRegister.addHandler(clrMetricReportServiceHandler);
grpcHandlerRegister.addHandler(new CLRMetricReportServiceHandlerCompat(clrMetricReportServiceHandler));
} }
@Override @Override
......
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*
*/
package org.apache.skywalking.oap.server.receiver.clr.provider.handler;
import io.grpc.stub.StreamObserver;
import lombok.RequiredArgsConstructor;
import org.apache.skywalking.apm.network.common.v3.Commands;
import org.apache.skywalking.apm.network.language.agent.v3.CLRMetricCollection;
import org.apache.skywalking.apm.network.language.agent.v3.compat.CLRMetricReportServiceGrpc;
import org.apache.skywalking.oap.server.library.server.grpc.GRPCHandler;
@RequiredArgsConstructor
public class CLRMetricReportServiceHandlerCompat extends CLRMetricReportServiceGrpc.CLRMetricReportServiceImplBase implements GRPCHandler {
private final CLRMetricReportServiceHandler delegate;
@Override
public void collect(final CLRMetricCollection request, final StreamObserver<Commands> responseObserver) {
delegate.collect(request, responseObserver);
}
}
...@@ -27,6 +27,7 @@ import org.apache.skywalking.oap.server.library.module.ModuleProvider; ...@@ -27,6 +27,7 @@ import org.apache.skywalking.oap.server.library.module.ModuleProvider;
import org.apache.skywalking.oap.server.library.module.ModuleStartException; import org.apache.skywalking.oap.server.library.module.ModuleStartException;
import org.apache.skywalking.oap.server.receiver.jvm.module.JVMModule; import org.apache.skywalking.oap.server.receiver.jvm.module.JVMModule;
import org.apache.skywalking.oap.server.receiver.jvm.provider.handler.JVMMetricReportServiceHandler; import org.apache.skywalking.oap.server.receiver.jvm.provider.handler.JVMMetricReportServiceHandler;
import org.apache.skywalking.oap.server.receiver.jvm.provider.handler.JVMMetricReportServiceHandlerCompat;
import org.apache.skywalking.oap.server.receiver.sharing.server.SharingServerModule; import org.apache.skywalking.oap.server.receiver.sharing.server.SharingServerModule;
public class JVMModuleProvider extends ModuleProvider { public class JVMModuleProvider extends ModuleProvider {
...@@ -61,7 +62,9 @@ public class JVMModuleProvider extends ModuleProvider { ...@@ -61,7 +62,9 @@ public class JVMModuleProvider extends ModuleProvider {
GRPCHandlerRegister grpcHandlerRegister = getManager().find(SharingServerModule.NAME) GRPCHandlerRegister grpcHandlerRegister = getManager().find(SharingServerModule.NAME)
.provider() .provider()
.getService(GRPCHandlerRegister.class); .getService(GRPCHandlerRegister.class);
grpcHandlerRegister.addHandler(new JVMMetricReportServiceHandler(getManager())); JVMMetricReportServiceHandler jvmMetricReportServiceHandler = new JVMMetricReportServiceHandler(getManager());
grpcHandlerRegister.addHandler(jvmMetricReportServiceHandler);
grpcHandlerRegister.addHandler(new JVMMetricReportServiceHandlerCompat(jvmMetricReportServiceHandler));
} }
@Override @Override
......
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*
*/
package org.apache.skywalking.oap.server.receiver.jvm.provider.handler;
import io.grpc.stub.StreamObserver;
import lombok.RequiredArgsConstructor;
import org.apache.skywalking.apm.network.common.v3.Commands;
import org.apache.skywalking.apm.network.language.agent.v3.JVMMetricCollection;
import org.apache.skywalking.apm.network.language.agent.v3.compat.JVMMetricReportServiceGrpc;
import org.apache.skywalking.oap.server.library.server.grpc.GRPCHandler;
@RequiredArgsConstructor
public class JVMMetricReportServiceHandlerCompat extends JVMMetricReportServiceGrpc.JVMMetricReportServiceImplBase implements GRPCHandler {
private final JVMMetricReportServiceHandler delegate;
@Override
public void collect(final JVMMetricCollection request, final StreamObserver<Commands> responseObserver) {
delegate.collect(request, responseObserver);
}
}
...@@ -26,6 +26,7 @@ import org.apache.skywalking.oap.server.library.module.ModuleDefine; ...@@ -26,6 +26,7 @@ import org.apache.skywalking.oap.server.library.module.ModuleDefine;
import org.apache.skywalking.oap.server.library.module.ModuleProvider; import org.apache.skywalking.oap.server.library.module.ModuleProvider;
import org.apache.skywalking.oap.server.receiver.register.module.RegisterModule; import org.apache.skywalking.oap.server.receiver.register.module.RegisterModule;
import org.apache.skywalking.oap.server.receiver.register.provider.handler.v8.grpc.ManagementServiceHandler; import org.apache.skywalking.oap.server.receiver.register.provider.handler.v8.grpc.ManagementServiceHandler;
import org.apache.skywalking.oap.server.receiver.register.provider.handler.v8.grpc.ManagementServiceHandlerCompat;
import org.apache.skywalking.oap.server.receiver.register.provider.handler.v8.rest.ManagementServiceKeepAliveHandler; import org.apache.skywalking.oap.server.receiver.register.provider.handler.v8.rest.ManagementServiceKeepAliveHandler;
import org.apache.skywalking.oap.server.receiver.register.provider.handler.v8.rest.ManagementServiceReportPropertiesHandler; import org.apache.skywalking.oap.server.receiver.register.provider.handler.v8.rest.ManagementServiceReportPropertiesHandler;
import org.apache.skywalking.oap.server.receiver.sharing.server.SharingServerModule; import org.apache.skywalking.oap.server.receiver.sharing.server.SharingServerModule;
...@@ -56,7 +57,9 @@ public class RegisterModuleProvider extends ModuleProvider { ...@@ -56,7 +57,9 @@ public class RegisterModuleProvider extends ModuleProvider {
GRPCHandlerRegister grpcHandlerRegister = getManager().find(SharingServerModule.NAME) GRPCHandlerRegister grpcHandlerRegister = getManager().find(SharingServerModule.NAME)
.provider() .provider()
.getService(GRPCHandlerRegister.class); .getService(GRPCHandlerRegister.class);
grpcHandlerRegister.addHandler(new ManagementServiceHandler(getManager())); ManagementServiceHandler managementServiceHandler = new ManagementServiceHandler(getManager());
grpcHandlerRegister.addHandler(managementServiceHandler);
grpcHandlerRegister.addHandler(new ManagementServiceHandlerCompat(managementServiceHandler));
JettyHandlerRegister jettyHandlerRegister = getManager().find(SharingServerModule.NAME) JettyHandlerRegister jettyHandlerRegister = getManager().find(SharingServerModule.NAME)
.provider() .provider()
......
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*
*/
package org.apache.skywalking.oap.server.receiver.register.provider.handler.v8.grpc;
import io.grpc.stub.StreamObserver;
import lombok.RequiredArgsConstructor;
import org.apache.skywalking.apm.network.common.v3.Commands;
import org.apache.skywalking.apm.network.management.v3.InstancePingPkg;
import org.apache.skywalking.apm.network.management.v3.InstanceProperties;
import org.apache.skywalking.apm.network.management.v3.compat.ManagementServiceGrpc;
import org.apache.skywalking.oap.server.library.server.grpc.GRPCHandler;
@RequiredArgsConstructor
public class ManagementServiceHandlerCompat extends ManagementServiceGrpc.ManagementServiceImplBase implements GRPCHandler {
private final ManagementServiceHandler delegate;
@Override
public void reportInstanceProperties(final InstanceProperties request, final StreamObserver<Commands> responseObserver) {
delegate.reportInstanceProperties(request, responseObserver);
}
@Override
public void keepAlive(final InstancePingPkg request, final StreamObserver<Commands> responseObserver) {
delegate.keepAlive(request, responseObserver);
}
}
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*
*/
package org.apache.skywalking.aop.server.receiver.mesh;
import io.grpc.stub.StreamObserver;
import lombok.RequiredArgsConstructor;
import org.apache.skywalking.apm.network.servicemesh.v3.MeshProbeDownstream;
import org.apache.skywalking.apm.network.servicemesh.v3.ServiceMeshMetric;
import org.apache.skywalking.apm.network.servicemesh.v3.compat.ServiceMeshMetricServiceGrpc;
@RequiredArgsConstructor
public class MeshGRPCHandlerCompat extends ServiceMeshMetricServiceGrpc.ServiceMeshMetricServiceImplBase {
private final MeshGRPCHandler delegate;
@Override
public StreamObserver<ServiceMeshMetric> collect(final StreamObserver<MeshProbeDownstream> responseObserver) {
return delegate.collect(responseObserver);
}
}
...@@ -68,7 +68,9 @@ public class MeshReceiverProvider extends ModuleProvider { ...@@ -68,7 +68,9 @@ public class MeshReceiverProvider extends ModuleProvider {
GRPCHandlerRegister service = getManager().find(SharingServerModule.NAME) GRPCHandlerRegister service = getManager().find(SharingServerModule.NAME)
.provider() .provider()
.getService(GRPCHandlerRegister.class); .getService(GRPCHandlerRegister.class);
service.addHandler(new MeshGRPCHandler(getManager())); MeshGRPCHandler meshGRPCHandler = new MeshGRPCHandler(getManager());
service.addHandler(meshGRPCHandler);
service.addHandler(new MeshGRPCHandlerCompat(meshGRPCHandler));
} }
@Override @Override
......
...@@ -28,6 +28,7 @@ import org.apache.skywalking.oap.server.library.module.ModuleProvider; ...@@ -28,6 +28,7 @@ import org.apache.skywalking.oap.server.library.module.ModuleProvider;
import org.apache.skywalking.oap.server.library.module.ServiceNotProvidedException; import org.apache.skywalking.oap.server.library.module.ServiceNotProvidedException;
import org.apache.skywalking.oap.server.receiver.meter.module.MeterReceiverModule; import org.apache.skywalking.oap.server.receiver.meter.module.MeterReceiverModule;
import org.apache.skywalking.oap.server.receiver.meter.provider.handler.MeterServiceHandler; import org.apache.skywalking.oap.server.receiver.meter.provider.handler.MeterServiceHandler;
import org.apache.skywalking.oap.server.receiver.meter.provider.handler.MeterServiceHandlerCompat;
import org.apache.skywalking.oap.server.receiver.sharing.server.SharingServerModule; import org.apache.skywalking.oap.server.receiver.sharing.server.SharingServerModule;
public class MeterReceiverProvider extends ModuleProvider { public class MeterReceiverProvider extends ModuleProvider {
...@@ -61,7 +62,9 @@ public class MeterReceiverProvider extends ModuleProvider { ...@@ -61,7 +62,9 @@ public class MeterReceiverProvider extends ModuleProvider {
GRPCHandlerRegister grpcHandlerRegister = getManager().find(SharingServerModule.NAME) GRPCHandlerRegister grpcHandlerRegister = getManager().find(SharingServerModule.NAME)
.provider() .provider()
.getService(GRPCHandlerRegister.class); .getService(GRPCHandlerRegister.class);
grpcHandlerRegister.addHandler(new MeterServiceHandler(processService)); MeterServiceHandler meterServiceHandler = new MeterServiceHandler(processService);
grpcHandlerRegister.addHandler(meterServiceHandler);
grpcHandlerRegister.addHandler(new MeterServiceHandlerCompat(meterServiceHandler));
} }
@Override @Override
......
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*
*/
package org.apache.skywalking.oap.server.receiver.meter.provider.handler;
import io.grpc.stub.StreamObserver;
import lombok.RequiredArgsConstructor;
import org.apache.skywalking.apm.network.common.v3.Commands;
import org.apache.skywalking.apm.network.language.agent.v3.MeterData;
import org.apache.skywalking.apm.network.language.agent.v3.compat.MeterReportServiceGrpc;
import org.apache.skywalking.oap.server.library.server.grpc.GRPCHandler;
@RequiredArgsConstructor
public class MeterServiceHandlerCompat extends MeterReportServiceGrpc.MeterReportServiceImplBase implements GRPCHandler {
private final MeterServiceHandler delegate;
@Override
public StreamObserver<MeterData> collect(final StreamObserver<Commands> responseObserver) {
return delegate.collect(responseObserver);
}
}
@@ -27,6 +27,7 @@ import org.apache.skywalking.oap.server.library.module.ModuleStartException;
 import org.apache.skywalking.oap.server.library.module.ServiceNotProvidedException;
 import org.apache.skywalking.oap.server.receiver.profile.module.ProfileModule;
 import org.apache.skywalking.oap.server.receiver.profile.provider.handler.ProfileTaskServiceHandler;
+import org.apache.skywalking.oap.server.receiver.profile.provider.handler.ProfileTaskServiceHandlerCompat;
 import org.apache.skywalking.oap.server.receiver.sharing.server.SharingServerModule;

 /**
@@ -57,7 +58,9 @@ public class ProfileModuleProvider extends ModuleProvider {
         GRPCHandlerRegister grpcHandlerRegister = getManager().find(SharingServerModule.NAME)
                                                               .provider()
                                                               .getService(GRPCHandlerRegister.class);
-        grpcHandlerRegister.addHandler(new ProfileTaskServiceHandler(getManager()));
+        ProfileTaskServiceHandler profileTaskServiceHandler = new ProfileTaskServiceHandler(getManager());
+        grpcHandlerRegister.addHandler(profileTaskServiceHandler);
+        grpcHandlerRegister.addHandler(new ProfileTaskServiceHandlerCompat(profileTaskServiceHandler));
     }

     @Override
...
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*
*/
package org.apache.skywalking.oap.server.receiver.profile.provider.handler;
import io.grpc.stub.StreamObserver;
import lombok.RequiredArgsConstructor;
import org.apache.skywalking.apm.network.common.v3.Commands;
import org.apache.skywalking.apm.network.language.profile.v3.ProfileTaskCommandQuery;
import org.apache.skywalking.apm.network.language.profile.v3.ProfileTaskFinishReport;
import org.apache.skywalking.apm.network.language.profile.v3.ThreadSnapshot;
import org.apache.skywalking.apm.network.language.profile.v3.compat.ProfileTaskGrpc;
import org.apache.skywalking.oap.server.library.server.grpc.GRPCHandler;
@RequiredArgsConstructor
public class ProfileTaskServiceHandlerCompat extends ProfileTaskGrpc.ProfileTaskImplBase implements GRPCHandler {
private final ProfileTaskServiceHandler delegate;
@Override
public void getProfileTaskCommands(final ProfileTaskCommandQuery request, final StreamObserver<Commands> responseObserver) {
delegate.getProfileTaskCommands(request, responseObserver);
}
@Override
public StreamObserver<ThreadSnapshot> collectSnapshot(final StreamObserver<Commands> responseObserver) {
return delegate.collectSnapshot(responseObserver);
}
@Override
public void reportTaskFinish(final ProfileTaskFinishReport request, final StreamObserver<Commands> responseObserver) {
delegate.reportTaskFinish(request, responseObserver);
}
}
@@ -30,6 +30,7 @@ import org.apache.skywalking.oap.server.library.module.ServiceNotProvidedException;
 import org.apache.skywalking.oap.server.receiver.sharing.server.SharingServerModule;
 import org.apache.skywalking.oap.server.receiver.trace.module.TraceModule;
 import org.apache.skywalking.oap.server.receiver.trace.provider.handler.v8.grpc.TraceSegmentReportServiceHandler;
+import org.apache.skywalking.oap.server.receiver.trace.provider.handler.v8.grpc.TraceSegmentReportServiceHandlerCompat;
 import org.apache.skywalking.oap.server.receiver.trace.provider.handler.v8.rest.TraceSegmentReportListServletHandler;
 import org.apache.skywalking.oap.server.receiver.trace.provider.handler.v8.rest.TraceSegmentReportSingleServletHandler;
 import org.apache.skywalking.oap.server.telemetry.TelemetryModule;
@@ -65,7 +66,9 @@ public class TraceModuleProvider extends ModuleProvider {
                                                               .provider()
                                                               .getService(JettyHandlerRegister.class);
-        grpcHandlerRegister.addHandler(new TraceSegmentReportServiceHandler(getManager()));
+        TraceSegmentReportServiceHandler traceSegmentReportServiceHandler = new TraceSegmentReportServiceHandler(getManager());
+        grpcHandlerRegister.addHandler(traceSegmentReportServiceHandler);
+        grpcHandlerRegister.addHandler(new TraceSegmentReportServiceHandlerCompat(traceSegmentReportServiceHandler));

         jettyHandlerRegister.addHandler(new TraceSegmentReportListServletHandler(getManager()));
         jettyHandlerRegister.addHandler(new TraceSegmentReportSingleServletHandler(getManager()));
...
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*
*/
package org.apache.skywalking.oap.server.receiver.trace.provider.handler.v8.grpc;
import io.grpc.stub.StreamObserver;
import lombok.RequiredArgsConstructor;
import org.apache.skywalking.apm.network.common.v3.Commands;
import org.apache.skywalking.apm.network.language.agent.v3.SegmentCollection;
import org.apache.skywalking.apm.network.language.agent.v3.SegmentObject;
import org.apache.skywalking.apm.network.language.agent.v3.compat.TraceSegmentReportServiceGrpc;
import org.apache.skywalking.oap.server.library.server.grpc.GRPCHandler;
@RequiredArgsConstructor
public class TraceSegmentReportServiceHandlerCompat extends TraceSegmentReportServiceGrpc.TraceSegmentReportServiceImplBase implements GRPCHandler {
private final TraceSegmentReportServiceHandler delegate;
@Override
public StreamObserver<SegmentObject> collect(final StreamObserver<Commands> responseObserver) {
return delegate.collect(responseObserver);
}
@Override
public void collectInSync(final SegmentCollection request, final StreamObserver<Commands> responseObserver) {
delegate.collectInSync(request, responseObserver);
}
}
@@ -46,17 +46,13 @@ import static org.influxdb.querybuilder.BuiltQuery.QueryBuilder.ti;
  */
 @Slf4j
 public class InfluxClient implements Client, HealthCheckable {
-    private InfluxStorageConfig config;
+    private final DelegatedHealthChecker healthChecker = new DelegatedHealthChecker();
+    private final InfluxStorageConfig config;
     private InfluxDB influx;
-    private DelegatedHealthChecker healthChecker = new DelegatedHealthChecker();

     /**
      * A constant, the name of time field in Time-series database.
      */
     public static final String TIME = "time";
-    /**
-     * A constant, the name of tag of time_bucket.
-     */
-    public static final String TAG_TIME_BUCKET = "_time_bucket";

     private final String database;
@@ -217,7 +213,7 @@ public class InfluxClient implements Client, HealthCheckable {
             this.healthChecker.health();
         } catch (Throwable e) {
             healthChecker.unHealth(e);
-            throw e;
+            throw new IOException(e);
         }
     }
...
@@ -69,7 +69,7 @@ import org.apache.skywalking.oap.server.telemetry.api.MetricsTag;
 @Slf4j
 public class InfluxStorageProvider extends ModuleProvider {
-    private InfluxStorageConfig config;
+    private final InfluxStorageConfig config;
     private InfluxClient client;

     public InfluxStorageProvider() {
@@ -123,8 +123,11 @@ public class InfluxStorageProvider extends ModuleProvider {
     @Override
     public void start() throws ServiceNotProvidedException, ModuleStartException {
-        MetricsCreator metricCreator = getManager().find(TelemetryModule.NAME).provider().getService(MetricsCreator.class);
-        HealthCheckMetrics healthChecker = metricCreator.createHealthCheckerGauge("storage_influxdb", MetricsTag.EMPTY_KEY, MetricsTag.EMPTY_VALUE);
+        MetricsCreator metricCreator = getManager().find(TelemetryModule.NAME)
+                                                   .provider()
+                                                   .getService(MetricsCreator.class);
+        HealthCheckMetrics healthChecker = metricCreator.createHealthCheckerGauge(
+            "storage_influxdb", MetricsTag.EMPTY_KEY, MetricsTag.EMPTY_VALUE);
         client.registerChecker(healthChecker);
         try {
             client.connect();
@@ -137,7 +140,7 @@ public class InfluxStorageProvider extends ModuleProvider {
     }

     @Override
-    public void notifyAfterCompleted() throws ServiceNotProvidedException, ModuleStartException {
+    public void notifyAfterCompleted() throws ServiceNotProvidedException {
     }
...
@@ -18,7 +18,6 @@
 package org.apache.skywalking.oap.server.storage.plugin.influxdb;

-import org.apache.skywalking.oap.server.core.storage.StorageException;
 import org.apache.skywalking.oap.server.core.storage.model.Model;
 import org.apache.skywalking.oap.server.core.storage.model.ModelInstaller;
 import org.apache.skywalking.oap.server.library.client.Client;
@@ -31,13 +30,13 @@ public class InfluxTableInstaller extends ModelInstaller {
     }

     @Override
-    protected boolean isExists(final Model model) throws StorageException {
+    protected boolean isExists(final Model model) {
         TableMetaInfo.addModel(model);
         return true;
     }

     @Override
-    protected void createTable(final Model model) throws StorageException {
+    protected void createTable(final Model model) {
         // Automatically create table
     }
 }
@@ -41,9 +41,9 @@ import org.apache.skywalking.oap.server.core.storage.model.ModelColumn;
 public class TableMetaInfo {
     private static final Map<String, TableMetaInfo> TABLES = new HashMap<>();

-    private Map<String, String> storageAndColumnMap;
-    private Map<String, String> storageAndTagMap;
-    private Model model;
+    private final Map<String, String> storageAndColumnMap;
+    private final Map<String, String> storageAndTagMap;
+    private final Model model;

     public static void addModel(Model model) {
         final List<ModelColumn> columns = model.getColumns();
@@ -88,7 +88,7 @@ public class TableMetaInfo {
             }
         }

-        TableMetaInfo info = TableMetaInfo.builder()
+        final TableMetaInfo info = TableMetaInfo.builder()
                                           .model(model)
                                           .storageAndTagMap(storageAndTagMap)
                                           .storageAndColumnMap(storageAndColumnMap)
...
@@ -36,17 +36,17 @@ import org.influxdb.dto.Point;
  * InfluxDB Point wrapper.
  */
 public class InfluxInsertRequest implements InsertRequest, UpdateRequest {
-    private Point.Builder builder;
-    private Map<String, Object> fields = Maps.newHashMap();
+    private final Point.Builder builder;
+    private final Map<String, Object> fields = Maps.newHashMap();

-    public InfluxInsertRequest(Model model, StorageData storageData, StorageBuilder storageBuilder) {
-        Map<String, Object> objectMap = storageBuilder.data2Map(storageData);
+    public <T extends StorageData> InfluxInsertRequest(Model model, T storageData, StorageBuilder<T> storageBuilder) {
+        final Map<String, Object> objectMap = storageBuilder.data2Map(storageData);
         if (SegmentRecord.INDEX_NAME.equals(model.getName())) {
             objectMap.remove(SegmentRecord.TAGS);
         }

         for (ModelColumn column : model.getColumns()) {
-            Object value = objectMap.get(column.getColumnName().getName());
+            final Object value = objectMap.get(column.getColumnName().getName());
             if (value instanceof StorageDataComplexObject) {
                 fields.put(
...
@@ -44,7 +44,7 @@ public class InfluxStorageDAO implements StorageDAO {
     @Override
     public IRecordDAO newRecordDao(StorageBuilder<Record> storageBuilder) {
-        return new RecordDAO(influxClient, storageBuilder);
+        return new RecordDAO(storageBuilder);
     }

     @Override
...
@@ -40,8 +40,8 @@ import static org.influxdb.querybuilder.BuiltQuery.QueryBuilder.select;
 @Slf4j
 public class ManagementDAO implements IManagementDAO {
-    private InfluxClient client;
-    private StorageBuilder<ManagementData> storageBuilder;
+    private final InfluxClient client;
+    private final StorageBuilder<ManagementData> storageBuilder;

     public ManagementDAO(InfluxClient client, StorageBuilder<ManagementData> storageBuilder) {
         this.client = client;
...
@@ -58,7 +58,7 @@ public class MetricsDAO implements IMetricsDAO {
     @Override
     public List<Metrics> multiGet(Model model, List<String> ids) throws IOException {
-        WhereQueryImpl<SelectQueryImpl> query = select()
+        final WhereQueryImpl<SelectQueryImpl> query = select()
             .raw(ALL_FIELDS)
             .from(client.getDatabase(), model.getName())
             .where(contains("id", Joiner.on("|").join(ids)));
@@ -72,10 +72,10 @@ public class MetricsDAO implements IMetricsDAO {
         }

         final List<Metrics> metrics = Lists.newArrayList();
-        List<String> columns = series.getColumns();
-        TableMetaInfo metaInfo = TableMetaInfo.get(model.getName());
-        Map<String, String> storageAndColumnMap = metaInfo.getStorageAndColumnMap();
+        final List<String> columns = series.getColumns();
+        final TableMetaInfo metaInfo = TableMetaInfo.get(model.getName());
+        final Map<String, String> storageAndColumnMap = metaInfo.getStorageAndColumnMap();

         series.getValues().forEach(values -> {
             Map<String, Object> data = Maps.newHashMap();
@@ -96,21 +96,19 @@ public class MetricsDAO implements IMetricsDAO {
     }

     @Override
-    public InsertRequest prepareBatchInsert(Model model, Metrics metrics) throws IOException {
+    public InsertRequest prepareBatchInsert(Model model, Metrics metrics) {
         final long timestamp = TimeBucket.getTimestamp(metrics.getTimeBucket(), model.getDownsampling());
-        TableMetaInfo tableMetaInfo = TableMetaInfo.get(model.getName());
+        final TableMetaInfo tableMetaInfo = TableMetaInfo.get(model.getName());
         final InfluxInsertRequest request = new InfluxInsertRequest(model, metrics, storageBuilder)
             .time(timestamp, TimeUnit.MILLISECONDS);

-        tableMetaInfo.getStorageAndTagMap().forEach((field, tag) -> {
-            request.addFieldAsTag(field, tag);
-        });
+        tableMetaInfo.getStorageAndTagMap().forEach(request::addFieldAsTag);
         return request;
     }

     @Override
-    public UpdateRequest prepareBatchUpdate(Model model, Metrics metrics) throws IOException {
+    public UpdateRequest prepareBatchUpdate(Model model, Metrics metrics) {
         return (UpdateRequest) this.prepareBatchInsert(model, metrics);
     }
 }
@@ -18,7 +18,6 @@
 package org.apache.skywalking.oap.server.storage.plugin.influxdb.base;

-import java.io.IOException;
 import java.util.concurrent.TimeUnit;
 import org.apache.skywalking.apm.commons.datacarrier.common.AtomicRangeInteger;
 import org.apache.skywalking.oap.server.core.analysis.TimeBucket;
@@ -33,8 +32,8 @@ public class NoneStreamDAO implements INoneStreamDAO {
     private static final int PADDING_SIZE = 1_000_000;
     private static final AtomicRangeInteger SUFFIX = new AtomicRangeInteger(0, PADDING_SIZE);

-    private InfluxClient client;
-    private StorageBuilder<NoneStream> storageBuilder;
+    private final InfluxClient client;
+    private final StorageBuilder<NoneStream> storageBuilder;

     public NoneStreamDAO(InfluxClient client, StorageBuilder<NoneStream> storageBuilder) {
         this.client = client;
@@ -42,15 +41,13 @@ public class NoneStreamDAO implements INoneStreamDAO {
     }

     @Override
-    public void insert(final Model model, final NoneStream noneStream) throws IOException {
+    public void insert(final Model model, final NoneStream noneStream) {
         final long timestamp = TimeBucket.getTimestamp(noneStream.getTimeBucket(), model.getDownsampling())
             * PADDING_SIZE + SUFFIX.getAndIncrement();
         final InfluxInsertRequest request = new InfluxInsertRequest(model, noneStream, storageBuilder)
             .time(timestamp, TimeUnit.NANOSECONDS);

-        TableMetaInfo.get(model.getName()).getStorageAndTagMap().forEach((field, tag) -> {
-            request.addFieldAsTag(field, tag);
-        });
+        TableMetaInfo.get(model.getName()).getStorageAndTagMap().forEach(request::addFieldAsTag);
         client.write(request.getPoint());
     }
 }
@@ -19,7 +19,6 @@
 package org.apache.skywalking.oap.server.storage.plugin.influxdb.base;

 import com.google.common.base.Joiner;
-import java.io.IOException;
 import java.util.List;
 import java.util.Map;
 import java.util.concurrent.TimeUnit;
@@ -33,44 +32,38 @@ import org.apache.skywalking.oap.server.core.storage.IRecordDAO;
 import org.apache.skywalking.oap.server.core.storage.StorageBuilder;
 import org.apache.skywalking.oap.server.core.storage.model.Model;
 import org.apache.skywalking.oap.server.library.client.request.InsertRequest;
-import org.apache.skywalking.oap.server.storage.plugin.influxdb.InfluxClient;
 import org.apache.skywalking.oap.server.storage.plugin.influxdb.TableMetaInfo;

 public class RecordDAO implements IRecordDAO {
     private static final int PADDING_SIZE = 1_000_000;
     private static final AtomicRangeInteger SUFFIX = new AtomicRangeInteger(0, PADDING_SIZE);

-    private InfluxClient client;
-    private StorageBuilder<Record> storageBuilder;
+    private final StorageBuilder<Record> storageBuilder;

-    public RecordDAO(InfluxClient client, StorageBuilder<Record> storageBuilder) {
-        this.client = client;
+    public RecordDAO(StorageBuilder<Record> storageBuilder) {
         this.storageBuilder = storageBuilder;
     }

     @Override
-    public InsertRequest prepareBatchInsert(Model model, Record record) throws IOException {
+    public InsertRequest prepareBatchInsert(Model model, Record record) {
         final long timestamp = TimeBucket.getTimestamp(record.getTimeBucket(), model.getDownsampling())
-            * PADDING_SIZE + SUFFIX.getAndIncrement();
+            * PADDING_SIZE
+            + SUFFIX.getAndIncrement();
         final InfluxInsertRequest request = new InfluxInsertRequest(model, record, storageBuilder)
             .time(timestamp, TimeUnit.NANOSECONDS);
-        TableMetaInfo.get(model.getName()).getStorageAndTagMap().forEach((field, tag) -> {
-            request.addFieldAsTag(field, tag);
-        });
+
+        TableMetaInfo.get(model.getName()).getStorageAndTagMap().forEach(request::addFieldAsTag);

         if (SegmentRecord.INDEX_NAME.equals(model.getName())) {
             Map<String, List<SpanTag>> collect = ((SegmentRecord) record).getTagsRawData()
                                                                          .stream()
                                                                          .collect(
                                                                              Collectors.groupingBy(SpanTag::getKey));
-            collect.entrySet().forEach(e -> {
-                request.tag(e.getKey(), "'" + Joiner.on("'")
-                                              .join(e.getValue()
                                                    .stream()
                                                    .map(SpanTag::getValue)
-                                                    .collect(Collectors.toSet())) + "'");
-            });
+            collect.forEach((key, value) -> request.tag(
+                key,
+                "'" + Joiner.on("'").join(value.stream().map(SpanTag::getValue).collect(Collectors.toSet())) + "'"
+            ));
         }

         return request;
     }
...
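The RecordDAO change groups the segment's span tags by key and joins each key's values into one single-quoted string (`"'" + Joiner.on("'").join(values) + "'"`), so a tag key maps to a value such as `'GET'POST'`. A hedged, JDK-only sketch of that grouping-and-joining step — `Tag` is a hypothetical stand-in for `SpanTag`, `Collectors.joining` replaces Guava's `Joiner`, and a `TreeMap` plus `distinct()` are used here only to make the output deterministic (the original uses a plain grouping and a `Set`):

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;
import java.util.stream.Collectors;

public class TagJoinSketch {
    // Minimal stand-in for SpanTag (assumes a recent JDK with records).
    record Tag(String key, String value) {}

    // Group tag values by key and join them the way RecordDAO does:
    // "'" + Joiner.on("'").join(values) + "'"
    static Map<String, String> join(List<Tag> tags) {
        Map<String, List<Tag>> byKey = tags.stream()
            .collect(Collectors.groupingBy(Tag::key, TreeMap::new, Collectors.toList()));
        Map<String, String> joined = new TreeMap<>();
        byKey.forEach((key, value) -> joined.put(
            key,
            "'" + value.stream().map(Tag::value).distinct().collect(Collectors.joining("'")) + "'"));
        return joined;
    }

    public static void main(String[] args) {
        List<Tag> tags = Arrays.asList(
            new Tag("http.method", "GET"),
            new Tag("http.method", "POST"),
            new Tag("status", "200"));
        System.out.println(join(tags));
    }
}
```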
@@ -44,7 +44,11 @@ import static org.influxdb.querybuilder.BuiltQuery.QueryBuilder.select;
 @Slf4j
 public class AggregationQuery implements IAggregationQueryDAO {
-    private InfluxClient client;
+    private static final Comparator<SelectedRecord> ASCENDING =
+        Comparator.comparingLong(a -> Long.parseLong(a.getValue()));
+    private static final Comparator<SelectedRecord> DESCENDING = (a, b) ->
+        Long.compare(Long.parseLong(b.getValue()), Long.parseLong(a.getValue()));
+    private final InfluxClient client;

     public AggregationQuery(InfluxClient client) {
         this.client = client;
@@ -72,11 +76,12 @@ public class AggregationQuery implements IAggregationQueryDAO {
         WhereSubQueryImpl<SelectSubQueryImpl<SelectQueryImpl>, SelectQueryImpl> where = select()
             .fromSubQuery(client.getDatabase())
             .mean(valueColumnName)
-            .from(condition.getName()).where();
+            .from(condition.getName())
+            .where();
         if (additionalConditions != null) {
-            additionalConditions.forEach(moreCondition -> {
-                where.and(eq(moreCondition.getKey(), moreCondition.getValue()));
-            });
+            additionalConditions.forEach(moreCondition ->
+                where.and(eq(moreCondition.getKey(), moreCondition.getValue()))
+            );
         }
         final SelectSubQueryImpl<SelectQueryImpl> subQuery = where
             .and(gte(InfluxClient.TIME, InfluxClient.timeIntervalTS(duration.getStartTimestamp())))
@@ -102,13 +107,8 @@ public class AggregationQuery implements IAggregationQueryDAO {
             entities.add(entity);
         });

-        Collections.sort(entities, comparator); // re-sort by self, because of the result order by time.
+        entities.sort(comparator); // re-sort by self, because of the result order by time.
         return entities;
     }
-
-    private static final Comparator<SelectedRecord> ASCENDING = (a, b) -> Long.compare(
-        Long.parseLong(a.getValue()), Long.parseLong(b.getValue()));
-    private static final Comparator<SelectedRecord> DESCENDING = (a, b) -> Long.compare(
-        Long.parseLong(b.getValue()), Long.parseLong(a.getValue()));
 }
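The refactored AggregationQuery comparators sort records by a metric value that is stored as a `String`, parsing it to a `long` so that ordering is numeric rather than lexicographic. A small standalone check of that idiom, with a hypothetical `Rec` record standing in for `SelectedRecord`:

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class ComparatorSketch {
    // Stand-in for SelectedRecord: the metric value arrives as a String.
    record Rec(String name, String value) {
        String getValue() { return value; }
    }

    // Same idiom as the AggregationQuery constants after the refactor.
    static final Comparator<Rec> ASCENDING =
        Comparator.comparingLong(a -> Long.parseLong(a.getValue()));
    static final Comparator<Rec> DESCENDING = (a, b) ->
        Long.compare(Long.parseLong(b.getValue()), Long.parseLong(a.getValue()));

    public static void main(String[] args) {
        // A plain String sort would order these "120" < "30" < "9"; parsing fixes that.
        List<Rec> recs = new ArrayList<>(List.of(
            new Rec("a", "30"), new Rec("b", "9"), new Rec("c", "120")));
        recs.sort(ASCENDING);
        System.out.println(recs.get(0).name() + recs.get(1).name() + recs.get(2).name());
        recs.sort(DESCENDING);
        System.out.println(recs.get(0).name() + recs.get(1).name() + recs.get(2).name());
    }
}
```

`Comparator.comparingLong` with `reversed()` would be an equivalent way to express the descending order.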
@@ -73,9 +73,7 @@ public class AlarmQuery implements IAlarmQueryDAO {
         WhereQueryImpl<SelectQueryImpl> countQuery = select().count(AlarmRecord.ID0)
                                                              .from(client.getDatabase(), AlarmRecord.INDEX_NAME)
                                                              .where();
-        recallQuery.getClauses().forEach(clause -> {
-            countQuery.where(clause);
-        });
+        recallQuery.getClauses().forEach(countQuery::where);

         Query query = new Query(countQuery.getCommand() + recallQuery.getCommand());
         List<QueryResult.Result> results = client.query(query);
...
@@ -80,7 +80,7 @@ public class MetadataQuery implements IMetadataQueryDAO {
     @Override
     public List<Service> getAllBrowserServices() throws IOException {
-        WhereQueryImpl<SelectQueryImpl> query = select(ID_COLUMN, NAME, ServiceTraffic.GROUP)
+        final WhereQueryImpl<SelectQueryImpl> query = select(ID_COLUMN, NAME, ServiceTraffic.GROUP)
             .from(client.getDatabase(), ServiceTraffic.INDEX_NAME)
             .where(eq(InfluxConstants.TagName.NODE_TYPE, String.valueOf(NodeType.Browser.value())));
         return buildServices(query);
@@ -88,16 +88,16 @@ public class MetadataQuery implements IMetadataQueryDAO {
     @Override
     public List<Database> getAllDatabases() throws IOException {
-        WhereQueryImpl<SelectQueryImpl> query = select(ID_COLUMN, NAME, ServiceTraffic.GROUP)
+        final WhereQueryImpl<SelectQueryImpl> query = select(ID_COLUMN, NAME, ServiceTraffic.GROUP)
             .from(client.getDatabase(), ServiceTraffic.INDEX_NAME)
             .where(eq(InfluxConstants.TagName.NODE_TYPE, String.valueOf(NodeType.Database.value())));
-        QueryResult.Series series = client.queryForSingleSeries(query);
+        final QueryResult.Series series = client.queryForSingleSeries(query);
         if (log.isDebugEnabled()) {
             log.debug("SQL: {} result: {}", query.getCommand(), series);
         }

-        List<Database> databases = Lists.newArrayList();
+        final List<Database> databases = Lists.newArrayList();
         if (Objects.nonNull(series)) {
             for (List<Object> values : series.getValues()) {
                 Database database = new Database();
@@ -111,8 +111,7 @@ public class MetadataQuery implements IMetadataQueryDAO {
     @Override
     public List<Service> searchServices(String keyword) throws IOException {
-        final WhereQueryImpl<SelectQueryImpl> where = select(
-            ID_COLUMN, NAME, ServiceTraffic.GROUP)
+        final WhereQueryImpl<SelectQueryImpl> where = select(ID_COLUMN, NAME, ServiceTraffic.GROUP)
             .from(client.getDatabase(), ServiceTraffic.INDEX_NAME)
             .where(eq(TagName.NODE_TYPE, String.valueOf(NodeType.Normal.value())));
         if (!Strings.isNullOrEmpty(keyword)) {
@@ -123,12 +122,11 @@ public class MetadataQuery implements IMetadataQueryDAO {
     @Override
     public Service searchService(String serviceCode) throws IOException {
-        WhereQueryImpl<SelectQueryImpl> where = select(
-            ID_COLUMN, NAME, ServiceTraffic.GROUP)
+        final WhereQueryImpl<SelectQueryImpl> whereQuery = select(ID_COLUMN, NAME, ServiceTraffic.GROUP)
             .from(client.getDatabase(), ServiceTraffic.INDEX_NAME)
-            .where(eq(TagName.NODE_TYPE, String.valueOf(NodeType.Normal.value())))
-            .and(eq(ServiceTraffic.NAME, serviceCode));
-        return buildServices(where).get(0);
+            .where(eq(TagName.NODE_TYPE, String.valueOf(NodeType.Normal.value())));
+        whereQuery.and(eq(InfluxConstants.NAME, serviceCode));
+        return buildServices(whereQuery).get(0);
     }

     @Override
@@ -147,7 +145,7 @@ public class MetadataQuery implements IMetadataQueryDAO {
         final QueryResult.Series series = client.queryForSingleSeries(where);
         if (log.isDebugEnabled()) {
-            log.debug("SQL: {} result: {}", where.getCommand(), series);
+            log.debug("SQL: {} result: {}.", where.getCommand(), series);
         }

         List<Endpoint> list = new ArrayList<>(limit);
@@ -189,7 +187,7 @@ public class MetadataQuery implements IMetadataQueryDAO {
         }

         if (Objects.isNull(series)) {
-            return Collections.EMPTY_LIST;
+            return Collections.emptyList();
         }

         List<List<Object>> result = series.getValues();
...
@@ -63,22 +63,20 @@ public class MetricsQuery implements IMetricsQueryDAO {
     public long readMetricsValue(final MetricsCondition condition,
                                  final String valueColumnName,
                                  final Duration duration) throws IOException {
-        int defaultValue = ValueColumnMetadata.INSTANCE.getDefaultValue(condition.getName());
+        final int defaultValue = ValueColumnMetadata.INSTANCE.getDefaultValue(condition.getName());
         final Function function = ValueColumnMetadata.INSTANCE.getValueFunction(condition.getName());
         if (function == Function.Latest) {
             return readMetricsValues(condition, valueColumnName, duration).getValues().latestValue(defaultValue);
         }
         final String measurement = condition.getName();
-        SelectionQueryImpl query = select();
-        switch (function) {
-            case Avg:
-                query.mean(valueColumnName);
-                break;
-            default:
-                query.sum(valueColumnName);
+        final SelectionQueryImpl query = select();
+        if (function == Function.Avg) {
+            query.mean(valueColumnName);
+        } else {
+            query.sum(valueColumnName);
         }
-        WhereQueryImpl<SelectQueryImpl> queryWhereQuery = query.from(client.getDatabase(), measurement).where();
+        final WhereQueryImpl<SelectQueryImpl> queryWhereQuery = query.from(client.getDatabase(), measurement).where();
         final String entityId = condition.getEntity().buildId();
         if (entityId != null) {
@@ -90,7 +88,7 @@ public class MetricsQuery implements IMetricsQueryDAO {
             .and(lte(InfluxClient.TIME, InfluxClient.timeIntervalTS(duration.getEndTimestamp())))
             .groupBy(InfluxConstants.TagName.ENTITY_ID);
-        List<QueryResult.Series> seriesList = client.queryForSeries(queryWhereQuery);
+        final List<QueryResult.Series> seriesList = client.queryForSeries(queryWhereQuery);
         if (log.isDebugEnabled()) {
             log.debug("SQL: {} result set: {}", queryWhereQuery.getCommand(), seriesList);
         }
@@ -109,12 +107,10 @@ public class MetricsQuery implements IMetricsQueryDAO {
                                         final String valueColumnName,
                                         final Duration duration) throws IOException {
         final List<PointOfTime> pointOfTimes = duration.assembleDurationPoints();
-        List<String> ids = new ArrayList<>(pointOfTimes.size());
-        pointOfTimes.forEach(pointOfTime -> {
-            ids.add(pointOfTime.id(condition.getEntity().buildId()));
-        });
+        final List<String> ids = new ArrayList<>(pointOfTimes.size());
+        pointOfTimes.forEach(pointOfTime -> ids.add(pointOfTime.id(condition.getEntity().buildId())));
-        WhereQueryImpl<SelectQueryImpl> query = select()
+        final WhereQueryImpl<SelectQueryImpl> query = select()
             .column(ID_COLUMN)
             .column(valueColumnName)
             .from(client.getDatabase(), condition.getName())
@@ -156,12 +152,10 @@ public class MetricsQuery implements IMetricsQueryDAO {
                                            final List<String> labels,
                                            final Duration duration) throws IOException {
         final List<PointOfTime> pointOfTimes = duration.assembleDurationPoints();
-        List<String> ids = new ArrayList<>(pointOfTimes.size());
-        pointOfTimes.forEach(pointOfTime -> {
-            ids.add(pointOfTime.id(condition.getEntity().buildId()));
-        });
+        final List<String> ids = new ArrayList<>(pointOfTimes.size());
+        pointOfTimes.forEach(pointOfTime -> ids.add(pointOfTime.id(condition.getEntity().buildId())));
-        WhereQueryImpl<SelectQueryImpl> query = select()
+        final WhereQueryImpl<SelectQueryImpl> query = select()
             .column(ID_COLUMN)
             .column(valueColumnName)
             .from(client.getDatabase(), condition.getName())
@@ -174,12 +168,12 @@ public class MetricsQuery implements IMetricsQueryDAO {
             query.where(contains(ID_COLUMN, Joiner.on("|").join(ids)));
         }
     }
-        List<QueryResult.Series> series = client.queryForSeries(query);
+        final List<QueryResult.Series> series = client.queryForSeries(query);
         if (log.isDebugEnabled()) {
             log.debug("SQL: {} result set: {}", query.getCommand(), series);
         }
-        Map<String, DataTable> idMap = new HashMap<>();
+        final Map<String, DataTable> idMap = new HashMap<>();
         if (!CollectionUtils.isEmpty(series)) {
             series.get(0).getValues().forEach(values -> {
                 final String id = (String) values.get(1);
@@ -196,26 +190,23 @@ public class MetricsQuery implements IMetricsQueryDAO {
                                final String valueColumnName,
                                final Duration duration) throws IOException {
         final List<PointOfTime> pointOfTimes = duration.assembleDurationPoints();
-        List<String> ids = new ArrayList<>(pointOfTimes.size());
-        pointOfTimes.forEach(pointOfTime -> {
-            ids.add(pointOfTime.id(condition.getEntity().buildId()));
-        });
+        final List<String> ids = new ArrayList<>(pointOfTimes.size());
+        pointOfTimes.forEach(pointOfTime -> ids.add(pointOfTime.id(condition.getEntity().buildId())));
-        WhereQueryImpl<SelectQueryImpl> query = select()
+        final WhereQueryImpl<SelectQueryImpl> query = select()
             .column(ID_COLUMN)
             .column(valueColumnName)
             .from(client.getDatabase(), condition.getName())
             .where(contains(ID_COLUMN, Joiner.on("|").join(ids)));
-        Map<String, List<Long>> thermodynamicValueMatrix = new HashMap<>();
-        QueryResult.Series series = client.queryForSingleSeries(query);
+        final QueryResult.Series series = client.queryForSingleSeries(query);
         if (log.isDebugEnabled()) {
             log.debug("SQL: {} result set: {}", query.getCommand(), series);
         }
         final int defaultValue = ValueColumnMetadata.INSTANCE.getDefaultValue(condition.getName());
-        HeatMap heatMap = new HeatMap();
+        final HeatMap heatMap = new HeatMap();
         if (series != null) {
             for (List<Object> values : series.getValues()) {
                 heatMap.buildColumn(values.get(1).toString(), values.get(2).toString(), defaultValue);
@@ -223,7 +214,6 @@ public class MetricsQuery implements IMetricsQueryDAO {
             }
         }
         heatMap.fixMissingColumns(ids, defaultValue);
         return heatMap;
     }
 }
\ No newline at end of file
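The `readMetricsValue` hunk above replaces a `switch` over the aggregation `Function` with an `if`/`else`: `mean` for `Avg`, `sum` for everything else. A minimal standalone sketch of that selection logic (the `Function` enum and `aggregate` helper here are hypothetical stand-ins, not part of the patch):

```java
import java.util.List;

public class AggregateDemo {
    enum Function { Avg, Sum, Latest }

    // Mirrors the patched branch: average only when the function is Avg,
    // otherwise fall back to a plain sum (the former switch's default arm).
    static double aggregate(Function function, List<Double> values) {
        double sum = values.stream().mapToDouble(Double::doubleValue).sum();
        if (function == Function.Avg) {
            return sum / values.size();
        } else {
            return sum;
        }
    }

    public static void main(String[] args) {
        List<Double> values = List.of(2.0, 4.0, 6.0);
        System.out.println(aggregate(Function.Avg, values)); // 4.0
        System.out.println(aggregate(Function.Sum, values)); // 12.0
    }
}
```

The `if`/`else` reads more directly than a two-arm `switch` with a `default` fall-through, which is presumably why the patch makes the swap.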
@@ -40,7 +40,7 @@ import static org.influxdb.querybuilder.BuiltQuery.QueryBuilder.select;
 @Slf4j
 public class NetworkAddressAliasDAO implements INetworkAddressAliasDAO {
     private final NetworkAddressAlias.Builder builder = new NetworkAddressAlias.Builder();
-    private InfluxClient client;
+    private final InfluxClient client;
     public NetworkAddressAliasDAO(final InfluxClient client) {
         this.client = client;
...
@@ -37,8 +37,8 @@ import static org.influxdb.querybuilder.BuiltQuery.QueryBuilder.select;
 @Slf4j
 public class ProfileTaskLogQuery implements IProfileTaskLogQueryDAO {
-    private InfluxClient client;
-    private int fetchTaskLogMaxSize;
+    private final InfluxClient client;
+    private final int fetchTaskLogMaxSize;
     public ProfileTaskLogQuery(InfluxClient client, int fetchTaskLogMaxSize) {
         this.client = client;
@@ -68,16 +68,14 @@ public class ProfileTaskLogQuery implements IProfileTaskLogQueryDAO {
         series.getValues().stream()
               // re-sort by self, because of the result order by time.
               .sorted((a, b) -> Long.compare(((Number) b.get(1)).longValue(), ((Number) a.get(1)).longValue()))
-              .forEach(values -> {
-                  taskLogs.add(ProfileTaskLog.builder()
-                                             .id((String) values.get(2))
-                                             .taskId((String) values.get(3))
-                                             .instanceId((String) values.get(4))
-                                             .operationTime(((Number) values.get(5)).longValue())
-                                             .operationType(ProfileTaskLogOperationType.parse(
-                                                 ((Number) values.get(6)).intValue()))
-                                             .build());
-              });
+              .forEach(values -> taskLogs.add(ProfileTaskLog.builder()
+                                                            .id((String) values.get(2))
+                                                            .taskId((String) values.get(3))
+                                                            .instanceId((String) values.get(4))
+                                                            .operationTime(((Number) values.get(5)).longValue())
+                                                            .operationType(ProfileTaskLogOperationType.parse(
+                                                                ((Number) values.get(6)).intValue()))
+                                                            .build()));
         return taskLogs;
     }
 }
@@ -52,7 +52,7 @@ public class ProfileTaskQuery implements IProfileTaskQueryDAO {
                                             final Long startTimeBucket,
                                             final Long endTimeBucket,
                                             final Integer limit) throws IOException {
-        WhereQueryImpl<SelectQueryImpl> query =
+        final WhereQueryImpl<SelectQueryImpl> query =
             select(
                 InfluxConstants.ID_COLUMN,
                 ProfileTaskRecord.SERVICE_ID,
@@ -83,15 +83,13 @@ public class ProfileTaskQuery implements IProfileTaskQueryDAO {
             query.limit(limit);
         }
-        List<ProfileTask> tasks = Lists.newArrayList();
+        final List<ProfileTask> tasks = Lists.newArrayList();
         QueryResult.Series series = client.queryForSingleSeries(query);
         if (log.isDebugEnabled()) {
             log.debug("SQL: {} result: {}", query.getCommand(), series);
         }
         if (series != null) {
-            series.getValues().forEach(values -> {
-                tasks.add(profileTaskBuilder(values));
-            });
+            series.getValues().forEach(values -> tasks.add(profileTaskBuilder(values)));
         }
         return tasks;
     }
@@ -101,7 +99,7 @@ public class ProfileTaskQuery implements IProfileTaskQueryDAO {
         if (StringUtil.isEmpty(id)) {
             return null;
         }
-        SelectQueryImpl query = select(
+        final SelectQueryImpl query = select(
             InfluxConstants.ID_COLUMN,
             ProfileTaskRecord.SERVICE_ID,
             ProfileTaskRecord.ENDPOINT_NAME,
@@ -117,7 +115,7 @@ public class ProfileTaskQuery implements IProfileTaskQueryDAO {
             .and(eq(InfluxConstants.ID_COLUMN, id))
             .limit(1);
-        QueryResult.Series series = client.queryForSingleSeries(query);
+        final QueryResult.Series series = client.queryForSingleSeries(query);
         if (log.isDebugEnabled()) {
             log.debug("SQL: {} result: {}", query.getCommand(), series);
         }
@@ -127,7 +125,7 @@ public class ProfileTaskQuery implements IProfileTaskQueryDAO {
             return null;
         }
-    private static final ProfileTask profileTaskBuilder(List<Object> values) {
+    private static ProfileTask profileTaskBuilder(List<Object> values) {
         return ProfileTask.builder()
                           .id((String) values.get(1))
                           .serviceId((String) values.get(2))
...
@@ -38,6 +38,7 @@ import org.apache.skywalking.oap.server.storage.plugin.influxdb.InfluxClient;
 import org.apache.skywalking.oap.server.storage.plugin.influxdb.InfluxConstants;
 import org.elasticsearch.common.Strings;
 import org.influxdb.dto.QueryResult;
+import org.influxdb.querybuilder.SelectQueryImpl;
 import org.influxdb.querybuilder.WhereQueryImpl;
 import static org.influxdb.querybuilder.BuiltQuery.QueryBuilder.contains;
@@ -56,26 +57,25 @@ public class ProfileThreadSnapshotQuery implements IProfileThreadSnapshotQueryDA
     @Override
     public List<BasicTrace> queryProfiledSegments(String taskId) throws IOException {
-        WhereQueryImpl query = select(ProfileThreadSnapshotRecord.SEGMENT_ID)
+        final WhereQueryImpl<SelectQueryImpl> countQuery = select(ProfileThreadSnapshotRecord.SEGMENT_ID)
             .from(client.getDatabase(), ProfileThreadSnapshotRecord.INDEX_NAME)
-            .where()
-            .and(eq(ProfileThreadSnapshotRecord.TASK_ID, taskId))
-            .and(eq(ProfileThreadSnapshotRecord.SEQUENCE, 0));
+            .where();
+        countQuery.and(eq(ProfileThreadSnapshotRecord.TASK_ID, taskId))
+                  .and(eq(ProfileThreadSnapshotRecord.SEQUENCE, 0));
         final LinkedList<String> segments = new LinkedList<>();
-        QueryResult.Series series = client.queryForSingleSeries(query);
+        QueryResult.Series series = client.queryForSingleSeries(countQuery);
         if (Objects.isNull(series)) {
             return Collections.emptyList();
         }
-        series.getValues().forEach(values -> {
-            segments.add((String) values.get(1));
-        });
+        series.getValues().forEach(values -> segments.add((String) values.get(1)));
         if (segments.isEmpty()) {
             return Collections.emptyList();
         }
-        query = select()
+        final WhereQueryImpl<SelectQueryImpl> whereQuery = select()
             .function(InfluxConstants.SORT_ASC, SegmentRecord.START_TIME, segments.size())
             .column(SegmentRecord.SEGMENT_ID)
             .column(SegmentRecord.START_TIME)
@@ -84,16 +84,16 @@ public class ProfileThreadSnapshotQuery implements IProfileThreadSnapshotQueryDA
             .column(SegmentRecord.IS_ERROR)
             .column(SegmentRecord.TRACE_ID)
             .from(client.getDatabase(), SegmentRecord.INDEX_NAME)
-            .where()
-            .and(contains(SegmentRecord.SEGMENT_ID, Joiner.on("|").join(segments)));
+            .where();
+        whereQuery.and(contains(SegmentRecord.SEGMENT_ID, Joiner.on("|").join(segments)));
         ArrayList<BasicTrace> result = Lists.newArrayListWithCapacity(segments.size());
-        client.queryForSingleSeries(query)
+        client.queryForSingleSeries(whereQuery)
               .getValues()
               .stream()
               .sorted((a, b) -> Long.compare(((Number) b.get(1)).longValue(), ((Number) a.get(1)).longValue()))
               .forEach(values -> {
-                  BasicTrace basicTrace = new BasicTrace();
+                  final BasicTrace basicTrace = new BasicTrace();
                   basicTrace.setSegmentId((String) values.get(2));
                   basicTrace.setStart(String.valueOf(((Number) values.get(3)).longValue()));
@@ -122,7 +122,7 @@ public class ProfileThreadSnapshotQuery implements IProfileThreadSnapshotQueryDA
     @Override
     public List<ProfileThreadSnapshotRecord> queryRecords(String segmentId, int minSequence,
                                                           int maxSequence) throws IOException {
-        WhereQueryImpl query = select(
+        WhereQueryImpl<SelectQueryImpl> whereQuery = select(
             ProfileThreadSnapshotRecord.TASK_ID,
             ProfileThreadSnapshotRecord.SEGMENT_ID,
             ProfileThreadSnapshotRecord.DUMP_TIME,
@@ -130,18 +130,19 @@ public class ProfileThreadSnapshotQuery implements IProfileThreadSnapshotQueryDA
             ProfileThreadSnapshotRecord.STACK_BINARY
         )
             .from(client.getDatabase(), ProfileThreadSnapshotRecord.INDEX_NAME)
-            .where(eq(ProfileThreadSnapshotRecord.SEGMENT_ID, segmentId))
-            .and(gte(ProfileThreadSnapshotRecord.SEQUENCE, minSequence))
-            .and(lte(ProfileThreadSnapshotRecord.SEQUENCE, maxSequence));
+            .where(eq(ProfileThreadSnapshotRecord.SEGMENT_ID, segmentId));
+        whereQuery.and(gte(ProfileThreadSnapshotRecord.SEQUENCE, minSequence))
+                  .and(lte(ProfileThreadSnapshotRecord.SEQUENCE, maxSequence));
-        QueryResult.Series series = client.queryForSingleSeries(query);
+        final QueryResult.Series series = client.queryForSingleSeries(whereQuery);
         if (log.isDebugEnabled()) {
-            log.debug("SQL: {} result: {}", query.getCommand(), series);
+            log.debug("SQL: {} result: {}", whereQuery.getCommand(), series);
         }
         if (Objects.isNull(series)) {
-            return Collections.EMPTY_LIST;
+            return Collections.emptyList();
         }
-        ArrayList<ProfileThreadSnapshotRecord> result = new ArrayList<>(maxSequence - minSequence);
+        final ArrayList<ProfileThreadSnapshotRecord> result = new ArrayList<>(maxSequence - minSequence);
         series.getValues().forEach(values -> {
             ProfileThreadSnapshotRecord record = new ProfileThreadSnapshotRecord();
@@ -162,29 +163,31 @@ public class ProfileThreadSnapshotQuery implements IProfileThreadSnapshotQueryDA
     @Override
     public SegmentRecord getProfiledSegment(String segmentId) throws IOException {
-        WhereQueryImpl query = select().column(SegmentRecord.SEGMENT_ID)
-                                       .column(SegmentRecord.TRACE_ID)
-                                       .column(SegmentRecord.SERVICE_ID)
-                                       .column(SegmentRecord.ENDPOINT_NAME)
-                                       .column(SegmentRecord.START_TIME)
-                                       .column(SegmentRecord.END_TIME)
-                                       .column(SegmentRecord.LATENCY)
-                                       .column(SegmentRecord.IS_ERROR)
-                                       .column(SegmentRecord.DATA_BINARY)
-                                       .column(SegmentRecord.VERSION)
-                                       .from(client.getDatabase(), SegmentRecord.INDEX_NAME)
-                                       .where()
-                                       .and(eq(SegmentRecord.SEGMENT_ID, segmentId));
-        List<QueryResult.Series> series = client.queryForSeries(query);
+        WhereQueryImpl<SelectQueryImpl> whereQuery = select()
+            .column(SegmentRecord.SEGMENT_ID)
+            .column(SegmentRecord.TRACE_ID)
+            .column(SegmentRecord.SERVICE_ID)
+            .column(SegmentRecord.ENDPOINT_NAME)
+            .column(SegmentRecord.START_TIME)
+            .column(SegmentRecord.END_TIME)
+            .column(SegmentRecord.LATENCY)
+            .column(SegmentRecord.IS_ERROR)
+            .column(SegmentRecord.DATA_BINARY)
+            .column(SegmentRecord.VERSION)
+            .from(client.getDatabase(), SegmentRecord.INDEX_NAME)
+            .where();
+        whereQuery.and(eq(SegmentRecord.SEGMENT_ID, segmentId));
+        List<QueryResult.Series> series = client.queryForSeries(whereQuery);
         if (log.isDebugEnabled()) {
-            log.debug("SQL: {} result set: {}", query.getCommand(), series);
+            log.debug("SQL: {} result set: {}", whereQuery.getCommand(), series);
         }
         if (Objects.isNull(series) || series.isEmpty()) {
             return null;
         }
-        List<Object> values = series.get(0).getValues().get(0);
-        SegmentRecord segmentRecord = new SegmentRecord();
+        final List<Object> values = series.get(0).getValues().get(0);
+        final SegmentRecord segmentRecord = new SegmentRecord();
         segmentRecord.setSegmentId((String) values.get(1));
         segmentRecord.setTraceId((String) values.get(2));
@@ -196,7 +199,7 @@ public class ProfileThreadSnapshotQuery implements IProfileThreadSnapshotQueryDA
         segmentRecord.setIsError(((Number) values.get(8)).intValue());
         segmentRecord.setVersion(((Number) values.get(10)).intValue());
-        String base64 = (String) values.get(9);
+        final String base64 = (String) values.get(9);
         if (!Strings.isNullOrEmpty(base64)) {
             segmentRecord.setDataBinary(Base64.getDecoder().decode(base64));
         }
@@ -205,13 +208,14 @@ public class ProfileThreadSnapshotQuery implements IProfileThreadSnapshotQueryDA
     }
     private int querySequenceWithAgg(String function, String segmentId, long start, long end) throws IOException {
-        WhereQueryImpl query = select()
+        WhereQueryImpl<SelectQueryImpl> query = select()
             .function(function, ProfileThreadSnapshotRecord.SEQUENCE)
             .from(client.getDatabase(), ProfileThreadSnapshotRecord.INDEX_NAME)
-            .where()
-            .and(eq(ProfileThreadSnapshotRecord.SEGMENT_ID, segmentId))
-            .and(gte(ProfileThreadSnapshotRecord.DUMP_TIME, start))
-            .and(lte(ProfileThreadSnapshotRecord.DUMP_TIME, end));
+            .where();
+        query.and(eq(ProfileThreadSnapshotRecord.SEGMENT_ID, segmentId))
+             .and(gte(ProfileThreadSnapshotRecord.DUMP_TIME, start))
+             .and(lte(ProfileThreadSnapshotRecord.DUMP_TIME, end));
         return client.getCounter(query);
     }
 }
@@ -35,6 +35,7 @@ import static org.apache.skywalking.oap.server.core.storage.query.ITopNRecordsQueryDAO;
 import org.apache.skywalking.oap.server.storage.plugin.influxdb.InfluxClient;
 import org.apache.skywalking.oap.server.storage.plugin.influxdb.InfluxConstants;
 import org.influxdb.dto.QueryResult;
+import org.influxdb.querybuilder.SelectQueryImpl;
 import org.influxdb.querybuilder.WhereQueryImpl;
 import static org.influxdb.querybuilder.BuiltQuery.QueryBuilder.eq;
@@ -62,13 +63,14 @@ public class TopNRecordsQuery implements ITopNRecordsQueryDAO {
             comparator = DESCENDING;
         }
-        WhereQueryImpl query = select()
+        final WhereQueryImpl<SelectQueryImpl> query = select()
             .function(function, valueColumnName, condition.getTopN())
             .column(TopN.STATEMENT)
             .column(TopN.TRACE_ID)
             .from(client.getDatabase(), condition.getName())
-            .where()
-            .and(gte(TopN.TIME_BUCKET, duration.getStartTimeBucketInSec()))
+            .where();
+        query.and(gte(TopN.TIME_BUCKET, duration.getStartTimeBucketInSec()))
             .and(lte(TopN.TIME_BUCKET, duration.getEndTimeBucketInSec()));
         if (StringUtil.isNotEmpty(condition.getParentService())) {
@@ -94,12 +96,12 @@ public class TopNRecordsQuery implements ITopNRecordsQueryDAO {
             records.add(record);
         });
-        Collections.sort(records, comparator); // re-sort by self, because of the result order by time.
+        records.sort(comparator); // re-sort by self, because of the result order by time.
         return records;
     }
-    private static final Comparator<SelectedRecord> ASCENDING = (a, b) -> Long.compare(
-        Long.parseLong(a.getValue()), Long.parseLong(b.getValue()));
+    private static final Comparator<SelectedRecord> ASCENDING = Comparator.comparingLong(
+        a -> Long.parseLong(a.getValue()));
     private static final Comparator<SelectedRecord> DESCENDING = (a, b) -> Long.compare(
         Long.parseLong(b.getValue()), Long.parseLong(a.getValue()));
...
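The `TopNRecordsQuery` hunk above swaps an explicit `(a, b) -> Long.compare(...)` lambda for `Comparator.comparingLong` in the ascending case. The two produce the same ordering; a minimal standalone check (the `Rec` record is a hypothetical stand-in for `SelectedRecord`, not part of the patch):

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class ComparatorDemo {
    // Stand-in for SelectedRecord: the sort key is a numeric string.
    record Rec(String value) { }

    // The original explicit comparator from the left-hand side of the diff.
    static final Comparator<Rec> EXPLICIT = (a, b) -> Long.compare(
        Long.parseLong(a.value()), Long.parseLong(b.value()));

    // The refactored form from the right-hand side of the diff.
    static final Comparator<Rec> COMPARING_LONG = Comparator.comparingLong(
        r -> Long.parseLong(r.value()));

    public static void main(String[] args) {
        // Numeric order, not lexicographic: "7" sorts before "30" and "100".
        List<Rec> a = new ArrayList<>(List.of(new Rec("30"), new Rec("7"), new Rec("100")));
        List<Rec> b = new ArrayList<>(a);
        a.sort(EXPLICIT);
        b.sort(COMPARING_LONG);
        System.out.println(a.equals(b)); // true
    }
}
```

`Comparator.comparingLong` avoids spelling out both operands and boxes nothing, which is presumably why the patch prefers it; the descending comparator keeps the explicit form since it reverses the arguments.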
@@ -54,11 +54,11 @@ public class TopologyQuery implements ITopologyQueryDAO {
     }
     @Override
-    public List<Call.CallDetail> loadServiceRelationsDetectedAtServerSide(long startTB,
-                                                                          long endTB,
-                                                                          List<String> serviceIds) throws IOException {
-        String measurement = ServiceRelationServerSideMetrics.INDEX_NAME;
-        WhereSubQueryImpl<SelectSubQueryImpl<SelectQueryImpl>, SelectQueryImpl> subQuery = buildServiceCallsQuery(
+    public List<Call.CallDetail> loadServiceRelationsDetectedAtServerSide(final long startTB,
+                                                                          final long endTB,
+                                                                          final List<String> serviceIds) throws IOException {
+        final String measurement = ServiceRelationServerSideMetrics.INDEX_NAME;
+        final WhereSubQueryImpl<SelectSubQueryImpl<SelectQueryImpl>, SelectQueryImpl> subQuery = buildServiceCallsQuery(
             measurement,
             startTB,
             endTB,
@@ -71,12 +71,11 @@ public class TopologyQuery implements ITopologyQueryDAO {
     }
     @Override
-    public List<Call.CallDetail> loadServiceRelationDetectedAtClientSide(long startTB,
-                                                                         long endTB,
+    public List<Call.CallDetail> loadServiceRelationDetectedAtClientSide(final long startTB,
+                                                                         final long endTB,
                                                                          List<String> serviceIds) throws IOException {
-        String measurement = ServiceRelationClientSideMetrics.INDEX_NAME;
-        WhereSubQueryImpl<SelectSubQueryImpl<SelectQueryImpl>, SelectQueryImpl> subQuery = buildServiceCallsQuery(
-            measurement,
+        final WhereSubQueryImpl<SelectSubQueryImpl<SelectQueryImpl>, SelectQueryImpl> subQuery = buildServiceCallsQuery(
+            ServiceRelationClientSideMetrics.INDEX_NAME,
             startTB,
             endTB,
             ServiceRelationServerSideMetrics.SOURCE_SERVICE_ID,
@@ -87,11 +86,10 @@ public class TopologyQuery implements ITopologyQueryDAO {
     }
     @Override
-    public List<Call.CallDetail> loadServiceRelationsDetectedAtServerSide(long startTB,
-                                                                          long endTB) throws IOException {
-        String measurement = ServiceRelationServerSideMetrics.INDEX_NAME;
-        WhereSubQueryImpl<SelectSubQueryImpl<SelectQueryImpl>, SelectQueryImpl> subQuery = buildServiceCallsQuery(
-            measurement,
+    public List<Call.CallDetail> loadServiceRelationsDetectedAtServerSide(final long startTB,
+                                                                          final long endTB) throws IOException {
+        final WhereSubQueryImpl<SelectSubQueryImpl<SelectQueryImpl>, SelectQueryImpl> subQuery = buildServiceCallsQuery(
+            ServiceRelationServerSideMetrics.INDEX_NAME,
             startTB,
             endTB,
             ServiceRelationServerSideMetrics.SOURCE_SERVICE_ID,
@@ -102,11 +100,10 @@ public class TopologyQuery implements ITopologyQueryDAO {
     }
     @Override
-    public List<Call.CallDetail> loadServiceRelationDetectedAtClientSide(long startTB,
-                                                                         long endTB) throws IOException {
-        String tableName = ServiceRelationClientSideMetrics.INDEX_NAME;
+    public List<Call.CallDetail> loadServiceRelationDetectedAtClientSide(final long startTB,
+                                                                         final long endTB) throws IOException {
         WhereSubQueryImpl<SelectSubQueryImpl<SelectQueryImpl>, SelectQueryImpl> subQuery = buildServiceCallsQuery(
-            tableName,
+            ServiceRelationClientSideMetrics.INDEX_NAME,
             startTB,
             endTB,
             ServiceRelationServerSideMetrics.SOURCE_SERVICE_ID,
@@ -117,13 +114,12 @@ public class TopologyQuery implements ITopologyQueryDAO {
     }
     @Override
-    public List<Call.CallDetail> loadInstanceRelationDetectedAtServerSide(String clientServiceId,
-                                                                          String serverServiceId,
-                                                                          long startTB,
-                                                                          long endTB) throws IOException {
-        String measurement = ServiceInstanceRelationServerSideMetrics.INDEX_NAME;
+    public List<Call.CallDetail> loadInstanceRelationDetectedAtServerSide(final String clientServiceId,
+                                                                          final String serverServiceId,
+                                                                          final long startTB,
+                                                                          final long endTB) throws IOException {
         WhereSubQueryImpl<SelectSubQueryImpl<SelectQueryImpl>, SelectQueryImpl> subQuery = buildServiceInstanceCallsQuery(
-            measurement,
+            ServiceInstanceRelationServerSideMetrics.INDEX_NAME,
             startTB,
             endTB,
             ServiceInstanceRelationServerSideMetrics.SOURCE_SERVICE_ID,
@@ -134,13 +130,12 @@ public class TopologyQuery implements ITopologyQueryDAO {
     }
     @Override
-    public List<Call.CallDetail> loadInstanceRelationDetectedAtClientSide(String clientServiceId,
+    public List<Call.CallDetail> loadInstanceRelationDetectedAtClientSide(final String clientServiceId,
String serverServiceId, final String serverServiceId,
long startTB, final long startTB,
long endTB) throws IOException { final long endTB) throws IOException {
String measurement = ServiceInstanceRelationClientSideMetrics.INDEX_NAME;
WhereSubQueryImpl<SelectSubQueryImpl<SelectQueryImpl>, SelectQueryImpl> subQuery = buildServiceInstanceCallsQuery( WhereSubQueryImpl<SelectSubQueryImpl<SelectQueryImpl>, SelectQueryImpl> subQuery = buildServiceInstanceCallsQuery(
measurement, ServiceInstanceRelationClientSideMetrics.INDEX_NAME,
startTB, startTB,
endTB, endTB,
ServiceInstanceRelationClientSideMetrics.SOURCE_SERVICE_ID, ServiceInstanceRelationClientSideMetrics.SOURCE_SERVICE_ID,
...@@ -151,13 +146,11 @@ public class TopologyQuery implements ITopologyQueryDAO { ...@@ -151,13 +146,11 @@ public class TopologyQuery implements ITopologyQueryDAO {
} }
@Override @Override
public List<Call.CallDetail> loadEndpointRelation(long startTB, public List<Call.CallDetail> loadEndpointRelation(final long startTB,
long endTB, final long endTB,
String destEndpointId) throws IOException { final String destEndpointId) throws IOException {
String measurement = EndpointRelationServerSideMetrics.INDEX_NAME; final WhereSubQueryImpl<SelectSubQueryImpl<SelectQueryImpl>, SelectQueryImpl> subQuery = buildServiceCallsQuery(
EndpointRelationServerSideMetrics.INDEX_NAME,
WhereSubQueryImpl<SelectSubQueryImpl<SelectQueryImpl>, SelectQueryImpl> subQuery = buildServiceCallsQuery(
measurement,
startTB, startTB,
endTB, endTB,
EndpointRelationServerSideMetrics.SOURCE_ENDPOINT, EndpointRelationServerSideMetrics.SOURCE_ENDPOINT,
...@@ -166,8 +159,8 @@ public class TopologyQuery implements ITopologyQueryDAO { ...@@ -166,8 +159,8 @@ public class TopologyQuery implements ITopologyQueryDAO {
); );
subQuery.and(eq(EndpointRelationServerSideMetrics.DEST_ENDPOINT, destEndpointId)); subQuery.and(eq(EndpointRelationServerSideMetrics.DEST_ENDPOINT, destEndpointId));
WhereSubQueryImpl<SelectSubQueryImpl<SelectQueryImpl>, SelectQueryImpl> subQuery2 = buildServiceCallsQuery( final WhereSubQueryImpl<SelectSubQueryImpl<SelectQueryImpl>, SelectQueryImpl> subQuery2 = buildServiceCallsQuery(
measurement, EndpointRelationServerSideMetrics.INDEX_NAME,
startTB, startTB,
endTB, endTB,
EndpointRelationServerSideMetrics.SOURCE_ENDPOINT, EndpointRelationServerSideMetrics.SOURCE_ENDPOINT,
...@@ -176,19 +169,20 @@ public class TopologyQuery implements ITopologyQueryDAO { ...@@ -176,19 +169,20 @@ public class TopologyQuery implements ITopologyQueryDAO {
); );
subQuery2.and(eq(EndpointRelationServerSideMetrics.SOURCE_ENDPOINT, destEndpointId)); subQuery2.and(eq(EndpointRelationServerSideMetrics.SOURCE_ENDPOINT, destEndpointId));
List<Call.CallDetail> calls = buildEndpointCalls(buildQuery(subQuery), DetectPoint.SERVER); final List<Call.CallDetail> calls = buildEndpointCalls(buildQuery(subQuery), DetectPoint.SERVER);
calls.addAll(buildEndpointCalls(buildQuery(subQuery), DetectPoint.CLIENT)); calls.addAll(buildEndpointCalls(buildQuery(subQuery), DetectPoint.CLIENT));
return calls; return calls;
} }
private WhereSubQueryImpl<SelectSubQueryImpl<SelectQueryImpl>, SelectQueryImpl> buildServiceCallsQuery( private WhereSubQueryImpl<SelectSubQueryImpl<SelectQueryImpl>, SelectQueryImpl> buildServiceCallsQuery(
String measurement, final String measurement,
long startTB, final long startTB,
long endTB, final long endTB,
String sourceCName, final String sourceCName,
String destCName, final String destCName,
List<String> serviceIds) { final List<String> serviceIds) {
WhereSubQueryImpl<SelectSubQueryImpl<SelectQueryImpl>, SelectQueryImpl> subQuery = select()
final WhereSubQueryImpl<SelectSubQueryImpl<SelectQueryImpl>, SelectQueryImpl> subQuery = select()
.fromSubQuery(client.getDatabase()) .fromSubQuery(client.getDatabase())
.function("distinct", ServiceInstanceRelationServerSideMetrics.COMPONENT_ID) .function("distinct", ServiceInstanceRelationServerSideMetrics.COMPONENT_ID)
.as(ServiceInstanceRelationClientSideMetrics.COMPONENT_ID) .as(ServiceInstanceRelationClientSideMetrics.COMPONENT_ID)
...@@ -198,7 +192,8 @@ public class TopologyQuery implements ITopologyQueryDAO { ...@@ -198,7 +192,8 @@ public class TopologyQuery implements ITopologyQueryDAO {
.and(lte(InfluxClient.TIME, InfluxClient.timeIntervalTB(endTB))); .and(lte(InfluxClient.TIME, InfluxClient.timeIntervalTB(endTB)));
if (!serviceIds.isEmpty()) { if (!serviceIds.isEmpty()) {
WhereNested whereNested = subQuery.andNested(); WhereNested<WhereSubQueryImpl<SelectSubQueryImpl<SelectQueryImpl>, SelectQueryImpl>> whereNested = subQuery
.andNested();
for (String id : serviceIds) { for (String id : serviceIds) {
whereNested.or(eq(sourceCName, id)) whereNested.or(eq(sourceCName, id))
.or(eq(destCName, id)); .or(eq(destCName, id));
...@@ -209,24 +204,25 @@ public class TopologyQuery implements ITopologyQueryDAO { ...@@ -209,24 +204,25 @@ public class TopologyQuery implements ITopologyQueryDAO {
} }
private WhereSubQueryImpl<SelectSubQueryImpl<SelectQueryImpl>, SelectQueryImpl> buildServiceInstanceCallsQuery( private WhereSubQueryImpl<SelectSubQueryImpl<SelectQueryImpl>, SelectQueryImpl> buildServiceInstanceCallsQuery(
String measurement, final String measurement,
long startTB, final long startTB,
long endTB, final long endTB,
String sourceCName, final String sourceCName,
String destCName, final String destCName,
String sourceServiceId, final String sourceServiceId,
String destServiceId) { final String destServiceId) {
WhereSubQueryImpl<SelectSubQueryImpl<SelectQueryImpl>, SelectQueryImpl> subQuery = select() final WhereSubQueryImpl<SelectSubQueryImpl<SelectQueryImpl>, SelectQueryImpl> subQuery = select()
.fromSubQuery(client.getDatabase()) .fromSubQuery(client.getDatabase())
.function("distinct", ServiceInstanceRelationServerSideMetrics.COMPONENT_ID) .function("distinct", ServiceInstanceRelationServerSideMetrics.COMPONENT_ID)
.as(ServiceInstanceRelationClientSideMetrics.COMPONENT_ID) .as(ServiceInstanceRelationClientSideMetrics.COMPONENT_ID)
.from(measurement) .from(measurement)
.where() .where();
.and(gte(InfluxClient.TIME, InfluxClient.timeIntervalTB(startTB)))
.and(lte(InfluxClient.TIME, InfluxClient.timeIntervalTB(endTB))); subQuery.and(gte(InfluxClient.TIME, InfluxClient.timeIntervalTB(startTB)))
.and(lte(InfluxClient.TIME, InfluxClient.timeIntervalTB(endTB)));
StringBuilder builder = new StringBuilder("(("); final StringBuilder builder = new StringBuilder("((");
builder.append(sourceCName).append("='").append(sourceServiceId) builder.append(sourceCName).append("='").append(sourceServiceId)
.append("' and ") .append("' and ")
.append(destCName).append("='").append(destServiceId) .append(destCName).append("='").append(destServiceId)
...@@ -242,7 +238,7 @@ public class TopologyQuery implements ITopologyQueryDAO { ...@@ -242,7 +238,7 @@ public class TopologyQuery implements ITopologyQueryDAO {
private List<Call.CallDetail> buildServiceCalls(Query query, private List<Call.CallDetail> buildServiceCalls(Query query,
DetectPoint detectPoint) throws IOException { DetectPoint detectPoint) throws IOException {
QueryResult.Series series = client.queryForSingleSeries(query); final QueryResult.Series series = client.queryForSingleSeries(query);
if (log.isDebugEnabled()) { if (log.isDebugEnabled()) {
log.debug("SQL: {} result set: {}", query.getCommand(), series); log.debug("SQL: {} result set: {}", query.getCommand(), series);
...@@ -251,7 +247,7 @@ public class TopologyQuery implements ITopologyQueryDAO { ...@@ -251,7 +247,7 @@ public class TopologyQuery implements ITopologyQueryDAO {
return Collections.emptyList(); return Collections.emptyList();
} }
List<Call.CallDetail> calls = new ArrayList<>(); final List<Call.CallDetail> calls = new ArrayList<>();
series.getValues().forEach(values -> { series.getValues().forEach(values -> {
Call.CallDetail call = new Call.CallDetail(); Call.CallDetail call = new Call.CallDetail();
String entityId = String.valueOf(values.get(1)); String entityId = String.valueOf(values.get(1));
...@@ -270,8 +266,7 @@ public class TopologyQuery implements ITopologyQueryDAO { ...@@ -270,8 +266,7 @@ public class TopologyQuery implements ITopologyQueryDAO {
return query; return query;
} }
private List<Call.CallDetail> buildInstanceCalls(Query query, private List<Call.CallDetail> buildInstanceCalls(Query query, DetectPoint detectPoint) throws IOException {
DetectPoint detectPoint) throws IOException {
QueryResult.Series series = client.queryForSingleSeries(query); QueryResult.Series series = client.queryForSingleSeries(query);
if (log.isDebugEnabled()) { if (log.isDebugEnabled()) {
......
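In `buildServiceInstanceCallsQuery` above, the symmetric source/dest condition is still assembled by hand with a `StringBuilder`. A minimal sketch of the clause shape in plain Java follows; the method and column names here are hypothetical, and the full `((A,B) or (B,A))` pairing is inferred, since the hunk is truncated before the closing parentheses:

```java
public class SymmetricClauseSketch {
    // Builds ((src='a' and dst='b') or (src='b' and dst='a')),
    // mirroring the StringBuilder logic in buildServiceInstanceCallsQuery.
    static String symmetricClause(String srcCol, String dstCol, String a, String b) {
        return "((" + srcCol + "='" + a + "' and " + dstCol + "='" + b + "')"
             + " or (" + srcCol + "='" + b + "' and " + dstCol + "='" + a + "'))";
    }

    public static void main(String[] args) {
        System.out.println(
            symmetricClause("source_service_id", "dest_service_id", "svcA", "svcB"));
    }
}
```

Pairing the columns both ways lets one query match instance calls detected in either direction.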
@@ -178,23 +178,24 @@ public class TraceQuery implements ITraceQueryDAO {
     @Override
     public List<SegmentRecord> queryByTraceId(String traceId) throws IOException {
-        WhereQueryImpl query = select().column(SegmentRecord.SEGMENT_ID)
+        WhereQueryImpl<SelectQueryImpl> whereQuery = select().column(SegmentRecord.SEGMENT_ID)
             .column(SegmentRecord.TRACE_ID)
             .column(SegmentRecord.SERVICE_ID)
             .column(SegmentRecord.SERVICE_INSTANCE_ID)
             .column(SegmentRecord.ENDPOINT_NAME)
             .column(SegmentRecord.START_TIME)
             .column(SegmentRecord.END_TIME)
             .column(SegmentRecord.LATENCY)
             .column(SegmentRecord.IS_ERROR)
             .column(SegmentRecord.DATA_BINARY)
             .column(SegmentRecord.VERSION)
             .from(client.getDatabase(), SegmentRecord.INDEX_NAME)
-            .where()
-            .and(eq(SegmentRecord.TRACE_ID, traceId));
-        List<QueryResult.Series> series = client.queryForSeries(query);
+            .where();
+
+        whereQuery.and(eq(SegmentRecord.TRACE_ID, traceId));
+        List<QueryResult.Series> series = client.queryForSeries(whereQuery);
         if (log.isDebugEnabled()) {
-            log.debug("SQL: {} result set: {}", query.getCommand(), series);
+            log.debug("SQL: {} result set: {}", whereQuery.getCommand(), series);
         }
         if (series == null || series.isEmpty()) {
             return Collections.emptyList();
@@ -226,7 +227,7 @@ public class TraceQuery implements ITraceQueryDAO {
     }
 
     @Override
-    public List<Span> doFlexibleTraceQuery(String traceId) throws IOException {
+    public List<Span> doFlexibleTraceQuery(String traceId) {
         return Collections.emptyList();
     }
 }
@@ -50,9 +50,9 @@ public class UITemplateManagementDAOImpl implements UITemplateManagementDAO {
     @Override
    public List<DashboardConfiguration> getAllTemplates(final Boolean includingDisabled) throws IOException {
-        WhereQueryImpl<SelectQueryImpl> where = select().raw("*::field")
+        final WhereQueryImpl<SelectQueryImpl> where = select().raw("*::field")
             .from(client.getDatabase(), UITemplate.INDEX_NAME)
             .where();
         if (!includingDisabled) {
             where.and(eq(UITemplate.DISABLED, BooleanUtils.FALSE));
         }
@@ -82,11 +82,11 @@ public class UITemplateManagementDAOImpl implements UITemplateManagementDAO {
         final UITemplate.Builder builder = new UITemplate.Builder();
         final UITemplate uiTemplate = setting.toEntity();
 
-        Point point = Point.measurement(UITemplate.INDEX_NAME)
+        final Point point = Point.measurement(UITemplate.INDEX_NAME)
             .tag(InfluxConstants.TagName.ID_COLUMN, uiTemplate.id())
             .fields(builder.data2Map(uiTemplate))
             .time(1L, TimeUnit.NANOSECONDS)
             .build();
         client.write(point);
         return TemplateChangeStatus.builder().status(true).build();
     }
@@ -102,11 +102,11 @@ public class UITemplateManagementDAOImpl implements UITemplateManagementDAO {
         QueryResult.Series series = client.queryForSingleSeries(query);
         if (Objects.nonNull(series)) {
-            Point point = Point.measurement(UITemplate.INDEX_NAME)
+            final Point point = Point.measurement(UITemplate.INDEX_NAME)
                 .fields(builder.data2Map(uiTemplate))
                 .tag(InfluxConstants.TagName.ID_COLUMN, uiTemplate.id())
                 .time(1L, TimeUnit.NANOSECONDS)
                 .build();
             client.write(point);
             return TemplateChangeStatus.builder().status(true).build();
         } else {
@@ -121,11 +121,11 @@ public class UITemplateManagementDAOImpl implements UITemplateManagementDAO {
             .where(eq(InfluxConstants.TagName.ID_COLUMN, name));
         QueryResult.Series series = client.queryForSingleSeries(query);
         if (Objects.nonNull(series)) {
-            Point point = Point.measurement(UITemplate.INDEX_NAME)
+            final Point point = Point.measurement(UITemplate.INDEX_NAME)
                 .tag(InfluxConstants.TagName.ID_COLUMN, name)
                 .addField(UITemplate.DISABLED, BooleanUtils.TRUE)
                 .time(1L, TimeUnit.NANOSECONDS)
                 .build();
             client.write(point);
             return TemplateChangeStatus.builder().status(true).build();
         } else {
......
@@ -26,7 +26,7 @@ import org.apache.skywalking.oap.server.library.module.ModuleConfig;
 @Getter
 public class H2StorageConfig extends ModuleConfig {
     private String driver = "org.h2.jdbcx.JdbcDataSource";
-    private String url = "jdbc:h2:mem:skywalking-oap-db";
+    private String url = "jdbc:h2:mem:skywalking-oap-db;DB_CLOSE_DELAY=-1";
     private String user = "";
     private String password = "";
     private int metadataQueryMaxSize = 5000;
......
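The `DB_CLOSE_DELAY=-1` option appended to the H2 URL above tells H2 to keep the in-memory database alive until the JVM exits; without it, H2 drops an in-memory database (including all tables) as soon as the last connection closes, which is fatal when connections come from a pool. A sketch of the resulting storage configuration, with YAML keys assumed from the `H2StorageConfig` field names rather than taken from this PR:

```properties
# jdbc:h2:mem:<name>[;OPTION=VALUE...]
# DB_CLOSE_DELAY=-1  -> keep the in-memory DB until JVM shutdown
url=jdbc:h2:mem:skywalking-oap-db;DB_CLOSE_DELAY=-1
driver=org.h2.jdbcx.JdbcDataSource
```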
@@ -189,7 +189,7 @@
         <lombok.version>1.18.10</lombok.version>
 
         <!-- core lib dependency -->
-        <bytebuddy.version>1.10.16</bytebuddy.version>
+        <bytebuddy.version>1.10.19</bytebuddy.version>
         <grpc.version>1.32.1</grpc.version>
         <gson.version>2.8.6</gson.version>
         <os-maven-plugin.version>1.6.2</os-maven-plugin.version>
......
-Subproject commit 76c7d0204b2fae6a69d6154ccc052fe69a6cdf67
+Subproject commit 075b29fce4f9a1aa50165385f0b30e61dfb9896d
@@ -55,7 +55,8 @@ public class TopoMatcher extends AbstractMatcher<Topology> {
             try {
                 getNodes().get(i).verify(topology.getNodes().get(j));
                 matched = true;
-            } catch (Throwable ignored) {
+            } catch (Throwable e) {
+                e.printStackTrace();
             }
         }
         if (!matched) {
@@ -71,7 +72,8 @@ public class TopoMatcher extends AbstractMatcher<Topology> {
             try {
                 getCalls().get(i).verify(topology.getCalls().get(j));
                 matched = true;
-            } catch (Throwable ignored) {
+            } catch (Throwable e) {
+                e.printStackTrace();
             }
         }
         if (!matched) {
......
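The `TopoMatcher` hunks above implement an "any-match" check: an expected node or call is accepted if `verify` succeeds against at least one actual item, and the PR now logs the mismatch instead of silently swallowing it. A minimal self-contained sketch of that pattern (names here are illustrative, not from the PR):

```java
import java.util.List;
import java.util.function.BiConsumer;

public class AnyMatchSketch {
    // An expected item matches if verify() completes without throwing
    // against at least one actual item, as in TopoMatcher's nested loops.
    static <T> boolean anyMatch(T expected, List<T> actuals, BiConsumer<T, T> verify) {
        for (T actual : actuals) {
            try {
                verify.accept(expected, actual);
                return true;
            } catch (Throwable e) {
                // Log rather than swallow, as the diff above now does;
                // here we simply continue to the next candidate.
            }
        }
        return false;
    }

    public static void main(String[] args) {
        BiConsumer<String, String> verify = (e, a) -> {
            if (!e.equals(a)) {
                throw new AssertionError(e + " != " + a);
            }
        };
        System.out.println(anyMatch("b", List.of("a", "b"), verify));
        System.out.println(anyMatch("c", List.of("a", "b"), verify));
    }
}
```

Surfacing the caught `Throwable` makes e2e failures diagnosable: without it, a non-matching topology reported only "not matched" with no hint of which field differed.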
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
FROM node:12
ENV COMMIT_HASH=55daa7e6385b6236987a5e26bb39242724730455
WORKDIR /app
EXPOSE 5050 5051
RUN git clone https://github.com/apache/skywalking-nodejs.git $(pwd)
RUN git reset --hard ${COMMIT_HASH} && git submodule update --init
RUN npm install
RUN npm run generate-source
RUN npm install express axios
/*!
*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*
*/
import * as http from 'http';
import agent from './src';
import axios from 'axios';
agent.start({
serviceName: 'consumer',
maxBufferSize: 1000,
});
const server = http.createServer((req, res) => {
axios
.post(`http://${process.env.SERVER || 'localhost:5000'}${req.url}`, {}, {
headers: {
'Content-Type': 'application/json'
}
})
.then((r) => res.end(JSON.stringify(r.data)))
.catch(err => res.end(JSON.stringify(err.message)));
});
server.listen(5001, () => console.info('Listening on port 5001...'));
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
version: '2.1'
services:
oap:
extends:
file: ../base-compose.yml
service: oap
ui:
extends:
file: ../base-compose.yml
service: ui
depends_on:
oap:
condition: service_healthy
provider:
build:
context: .
dockerfile: Dockerfile.nodejs
networks:
- e2e
expose:
- 5000
environment:
SW_AGENT_COLLECTOR_BACKEND_SERVICES: oap:11800
SW_AGENT_INSTANCE: provider-instance
volumes:
- ./provider.ts:/app/provider.ts
depends_on:
oap:
condition: service_healthy
healthcheck:
test: [ "CMD", "bash", "-c", "cat < /dev/null > /dev/tcp/127.0.0.1/5000" ]
interval: 5s
timeout: 60s
retries: 120
entrypoint: [ 'npx', 'ts-node', '/app/provider.ts' ]
medium:
extends:
file: ../base-compose.yml
service: consumer
environment:
PROVIDER_URL: http://provider:5000
depends_on:
oap:
condition: service_healthy
provider:
condition: service_healthy
consumer:
build:
context: .
dockerfile: Dockerfile.nodejs
networks:
- e2e
expose:
- 5001
environment:
SW_AGENT_COLLECTOR_BACKEND_SERVICES: oap:11800
SW_AGENT_INSTANCE: consumer-instance
SERVER: medium:9092
volumes:
- ./consumer.ts:/app/consumer.ts
depends_on:
oap:
condition: service_healthy
medium:
condition: service_healthy
healthcheck:
test: [ "CMD", "bash", "-c", "cat < /dev/null > /dev/tcp/127.0.0.1/5051" ]
interval: 5s
timeout: 60s
retries: 120
entrypoint: [ 'npx', 'ts-node', '/app/consumer.ts' ]
networks:
e2e:
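The compose file above health-checks the Node.js containers with the bash built-in redirection `cat < /dev/null > /dev/tcp/127.0.0.1/5051`, which succeeds only if the TCP port accepts a connection. An equivalent probe in plain Java (a sketch; host, port, and timeout values are illustrative):

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class TcpHealthCheckSketch {
    // Succeeds iff the target TCP port accepts a connection within the
    // timeout, mirroring bash's /dev/tcp healthcheck trick.
    static boolean portOpen(String host, int port, int timeoutMillis) {
        try (Socket socket = new Socket()) {
            socket.connect(new InetSocketAddress(host, port), timeoutMillis);
            return true;
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Port 1 is virtually never listening locally; used only for demonstration.
        System.out.println(portOpen("127.0.0.1", 1, 500));
    }
}
```

The `/dev/tcp` form is used in the compose file because the slim Node images ship bash but no `nc` or `curl`.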
/*!
*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*
*/
import agent from './src';
import express from 'express';
agent.start({
serviceName: 'provider',
maxBufferSize: 1000,
});
const app = express();
const handle = (req, res) => setTimeout(() => res.send({'id': 1, 'name': 'ke'}), 200);
app.get('/users', handle);
app.post('/users', handle);
app.listen(5000, () => console.info('Listening on port 5000...'));
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
version: '2.1'
services:
oap:
environment:
SW_PROMETHEUS_FETCHER: "default"
SW_TELEMETRY: prometheus
extends:
file: ../../base-compose.yml
service: oap
ui:
extends:
file: ../../base-compose.yml
service: ui
depends_on:
oap:
condition: service_healthy
provider:
build:
context: ../../../..
dockerfile: e2e-test/docker/Dockerfile.provider
args:
- SW_AGENT_JDK_VERSION=${SW_AGENT_JDK_VERSION:-11}
- DIST_PACKAGE=http://archive.apache.org/dist/skywalking/8.3.0/apache-skywalking-apm-8.3.0.tar.gz
networks:
- e2e
expose:
- 9090
- 5005
volumes:
- ../../../jacoco:/jacoco
environment:
SW_AGENT_COLLECTOR_BACKEND_SERVICES: oap:11800
JAVA_OPTS: -javaagent:/skywalking/agent/skywalking-agent.jar=logging.output=CONSOLE
healthcheck:
test: ["CMD", "sh", "-c", "nc -nz 127.0.0.1 9090"]
interval: 5s
timeout: 60s
retries: 120
depends_on:
oap:
condition: service_healthy
networks:
e2e:
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*
*/
package org.apache.skywalking.e2e;
import java.util.List;
import java.util.stream.Collectors;
import lombok.extern.slf4j.Slf4j;
import org.apache.skywalking.e2e.annotation.ContainerHostAndPort;
import org.apache.skywalking.e2e.annotation.DockerCompose;
import org.apache.skywalking.e2e.base.SkyWalkingE2E;
import org.apache.skywalking.e2e.base.SkyWalkingTestAdapter;
import org.apache.skywalking.e2e.common.HostAndPort;
import org.apache.skywalking.e2e.metrics.AtLeastOneOfMetricsMatcher;
import org.apache.skywalking.e2e.metrics.Metrics;
import org.apache.skywalking.e2e.metrics.MetricsQuery;
import org.apache.skywalking.e2e.metrics.MetricsValueMatcher;
import org.apache.skywalking.e2e.retryable.RetryableTest;
import org.apache.skywalking.e2e.service.Service;
import org.apache.skywalking.e2e.service.ServicesMatcher;
import org.apache.skywalking.e2e.service.ServicesQuery;
import org.apache.skywalking.e2e.service.endpoint.Endpoint;
import org.apache.skywalking.e2e.service.endpoint.EndpointQuery;
import org.apache.skywalking.e2e.service.endpoint.Endpoints;
import org.apache.skywalking.e2e.service.endpoint.EndpointsMatcher;
import org.apache.skywalking.e2e.service.instance.Instance;
import org.apache.skywalking.e2e.service.instance.Instances;
import org.apache.skywalking.e2e.service.instance.InstancesMatcher;
import org.apache.skywalking.e2e.service.instance.InstancesQuery;
import org.apache.skywalking.e2e.topo.Call;
import org.apache.skywalking.e2e.topo.ServiceInstanceTopology;
import org.apache.skywalking.e2e.topo.ServiceInstanceTopologyMatcher;
import org.apache.skywalking.e2e.topo.ServiceInstanceTopologyQuery;
import org.apache.skywalking.e2e.topo.TopoMatcher;
import org.apache.skywalking.e2e.topo.TopoQuery;
import org.apache.skywalking.e2e.topo.Topology;
import org.apache.skywalking.e2e.trace.Trace;
import org.apache.skywalking.e2e.trace.TracesMatcher;
import org.apache.skywalking.e2e.trace.TracesQuery;
import org.junit.jupiter.api.AfterAll;
import org.junit.jupiter.api.BeforeAll;
import org.testcontainers.containers.DockerComposeContainer;
import static org.apache.skywalking.e2e.metrics.MetricsMatcher.verifyMetrics;
import static org.apache.skywalking.e2e.metrics.MetricsMatcher.verifyPercentileMetrics;
import static org.apache.skywalking.e2e.metrics.MetricsQuery.ALL_ENDPOINT_METRICS;
import static org.apache.skywalking.e2e.metrics.MetricsQuery.ALL_ENDPOINT_MULTIPLE_LINEAR_METRICS;
import static org.apache.skywalking.e2e.metrics.MetricsQuery.ALL_INSTANCE_METRICS;
import static org.apache.skywalking.e2e.metrics.MetricsQuery.ALL_SERVICE_INSTANCE_RELATION_CLIENT_METRICS;
import static org.apache.skywalking.e2e.metrics.MetricsQuery.ALL_SERVICE_INSTANCE_RELATION_SERVER_METRICS;
import static org.apache.skywalking.e2e.metrics.MetricsQuery.ALL_SERVICE_METRICS;
import static org.apache.skywalking.e2e.metrics.MetricsQuery.ALL_SERVICE_MULTIPLE_LINEAR_METRICS;
import static org.apache.skywalking.e2e.metrics.MetricsQuery.ALL_SERVICE_RELATION_CLIENT_METRICS;
import static org.apache.skywalking.e2e.metrics.MetricsQuery.ALL_SERVICE_RELATION_SERVER_METRICS;
import static org.apache.skywalking.e2e.utils.Times.now;
import static org.apache.skywalking.e2e.utils.Yamls.load;
@Slf4j
@SkyWalkingE2E
public class NodeJSE2E extends SkyWalkingTestAdapter {
@SuppressWarnings("unused")
@DockerCompose("docker/nodejs/docker-compose.yml")
private DockerComposeContainer<?> justForSideEffects;
@SuppressWarnings("unused")
@ContainerHostAndPort(name = "ui", port = 8080)
private HostAndPort swWebappHostPort;
@SuppressWarnings("unused")
@ContainerHostAndPort(name = "consumer", port = 5001)
private HostAndPort nodejsHostPort;
@BeforeAll
public void setUp() throws Exception {
queryClient(swWebappHostPort);
trafficController(nodejsHostPort, "/users");
}
@AfterAll
public void tearDown() {
trafficController.stop();
}
@RetryableTest
void services() throws Exception {
List<Service> services = graphql.services(new ServicesQuery().start(startTime).end(now()));
services = services.stream().filter(s -> !s.getLabel().equals("oap::oap-server")).collect(Collectors.toList());
LOGGER.info("services: {}", services);
load("expected/nodejs/services.yml").as(ServicesMatcher.class).verify(services);
for (Service service : services) {
if ("Your_ApplicationName".equals(service.getLabel())) {
continue;
}
LOGGER.info("verifying service instances: {}", service);
verifyServiceMetrics(service);
final Instances instances = verifyServiceInstances(service);
verifyInstancesMetrics(instances);
final Endpoints endpoints = verifyServiceEndpoints(service);
verifyEndpointsMetrics(endpoints);
}
}
@RetryableTest
void traces() throws Exception {
final List<Trace> traces = graphql.traces(new TracesQuery().start(startTime).end(now()).orderByStartTime());
LOGGER.info("traces: {}", traces);
load("expected/nodejs/traces.yml").as(TracesMatcher.class).verifyLoosely(traces);
}
@RetryableTest
void topology() throws Exception {
final Topology topology = graphql.topo(new TopoQuery().stepByMinute().start(startTime.minusDays(1)).end(now()));
LOGGER.info("topology: {}", topology);
load("expected/nodejs/topo.yml").as(TopoMatcher.class).verify(topology);
verifyServiceRelationMetrics(topology.getCalls());
}
@RetryableTest
void serviceInstances() throws Exception {
final ServiceInstanceTopology topology = graphql.serviceInstanceTopo(
new ServiceInstanceTopologyQuery().stepByMinute()
.start(startTime.minusDays(1))
.end(now())
.clientServiceId("Y29uc3VtZXI=.1")
.serverServiceId("WW91cl9BcHBsaWNhdGlvbk5hbWU=.1"));
LOGGER.info("topology: {}", topology);
load("expected/nodejs/consumer-instance-topo.yml").as(ServiceInstanceTopologyMatcher.class).verify(topology);
verifyServiceInstanceRelationMetrics(topology.getCalls());
}
private Instances verifyServiceInstances(final Service service) throws Exception {
final Instances instances = graphql.instances(
new InstancesQuery().serviceId(service.getKey()).start(startTime).end(now())
);
LOGGER.info("instances: {} {}", service.getLabel(), instances);
load(String.format("expected/nodejs/%s-instances.yml", service.getLabel()))
.as(InstancesMatcher.class)
.verify(instances);
return instances;
}
private Endpoints verifyServiceEndpoints(final Service service) throws Exception {
final Endpoints endpoints = graphql.endpoints(new EndpointQuery().serviceId(service.getKey()));
LOGGER.info("endpoints: {} {}", service.getLabel(), endpoints);
load(String.format("expected/nodejs/%s-endpoints.yml", service.getLabel()))
.as(EndpointsMatcher.class)
.verify(endpoints);
return endpoints;
}
private void verifyInstancesMetrics(Instances instances) throws Exception {
for (Instance instance : instances.getInstances()) {
for (String metricsName : ALL_INSTANCE_METRICS) {
            LOGGER.info("verifying service instance metrics, {}: {}", metricsName, instance);
final Metrics instanceMetrics = graphql.metrics(
new MetricsQuery().stepByMinute().metricsName(metricsName).id(instance.getKey())
);
LOGGER.info("instance metrics: {}", instanceMetrics);
final AtLeastOneOfMetricsMatcher instanceRespTimeMatcher = new AtLeastOneOfMetricsMatcher();
final MetricsValueMatcher greaterThanZero = new MetricsValueMatcher();
greaterThanZero.setValue("gt 0");
instanceRespTimeMatcher.setValue(greaterThanZero);
instanceRespTimeMatcher.verify(instanceMetrics);
LOGGER.info("{}: {}", metricsName, instanceMetrics);
}
}
}
private void verifyEndpointsMetrics(Endpoints endpoints) throws Exception {
for (Endpoint endpoint : endpoints.getEndpoints()) {
for (final String metricName : ALL_ENDPOINT_METRICS) {
LOGGER.info("verifying endpoint {}: {}", endpoint, metricName);
final Metrics metrics = graphql.metrics(
new MetricsQuery().stepByMinute().metricsName(metricName).id(endpoint.getKey())
);
LOGGER.info("metrics: {}", metrics);
final AtLeastOneOfMetricsMatcher instanceRespTimeMatcher = new AtLeastOneOfMetricsMatcher();
final MetricsValueMatcher greaterThanZero = new MetricsValueMatcher();
greaterThanZero.setValue("gt 0");
instanceRespTimeMatcher.setValue(greaterThanZero);
instanceRespTimeMatcher.verify(metrics);
LOGGER.info("{}: {}", metricName, metrics);
}
for (String metricName : ALL_ENDPOINT_MULTIPLE_LINEAR_METRICS) {
verifyPercentileMetrics(graphql, metricName, endpoint.getKey(), startTime);
}
}
}
private void verifyServiceMetrics(final Service service) throws Exception {
for (String metricName : ALL_SERVICE_METRICS) {
LOGGER.info("verifying service {}, metrics: {}", service, metricName);
final Metrics serviceMetrics = graphql.metrics(
new MetricsQuery().stepByMinute().metricsName(metricName).id(service.getKey())
);
LOGGER.info("serviceMetrics: {}", serviceMetrics);
final AtLeastOneOfMetricsMatcher instanceRespTimeMatcher = new AtLeastOneOfMetricsMatcher();
final MetricsValueMatcher greaterThanZero = new MetricsValueMatcher();
greaterThanZero.setValue("gt 0");
instanceRespTimeMatcher.setValue(greaterThanZero);
instanceRespTimeMatcher.verify(serviceMetrics);
LOGGER.info("{}: {}", metricName, serviceMetrics);
}
for (String metricName : ALL_SERVICE_MULTIPLE_LINEAR_METRICS) {
verifyPercentileMetrics(graphql, metricName, service.getKey(), startTime);
}
}
private void verifyServiceInstanceRelationMetrics(final List<Call> calls) throws Exception {
verifyRelationMetrics(
calls, ALL_SERVICE_INSTANCE_RELATION_CLIENT_METRICS,
ALL_SERVICE_INSTANCE_RELATION_SERVER_METRICS
);
}
private void verifyServiceRelationMetrics(final List<Call> calls) throws Exception {
verifyRelationMetrics(calls, ALL_SERVICE_RELATION_CLIENT_METRICS, ALL_SERVICE_RELATION_SERVER_METRICS);
}
private void verifyRelationMetrics(final List<Call> calls,
final String[] relationClientMetrics,
final String[] relationServerMetrics) throws Exception {
for (Call call : calls) {
for (String detectPoint : call.getDetectPoints()) {
switch (detectPoint) {
case "CLIENT": {
for (String metricName : relationClientMetrics) {
verifyMetrics(graphql, metricName, call.getId(), startTime);
}
break;
}
case "SERVER": {
for (String metricName : relationServerMetrics) {
verifyMetrics(graphql, metricName, call.getId(), startTime);
}
break;
}
}
}
}
}
}
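The metric checks above repeatedly build an `AtLeastOneOfMetricsMatcher` with a `MetricsValueMatcher` whose expected value is the DSL string `"gt 0"`: the series passes when at least one sampled value is greater than zero. A minimal standalone sketch of that "at least one value satisfies the predicate" idea, assuming a simple `"<op> <number>"` expression grammar (class and method names here are hypothetical, not the real e2e framework API, and string forms such as `"not null"` are out of scope):

```java
import java.util.List;

// Hypothetical sketch of the at-least-one metric check: parse an expected-value
// expression like "gt 0" and accept the series if any sampled value matches.
public class AtLeastOneMatcherSketch {
    static boolean matches(String expression, long value) {
        String[] parts = expression.trim().split("\\s+");
        long expected = Long.parseLong(parts[1]);
        switch (parts[0]) {
            case "gt": return value > expected;
            case "ge": return value >= expected;
            case "lt": return value < expected;
            case "le": return value <= expected;
            case "eq": return value == expected;
            default: throw new IllegalArgumentException("unknown op: " + parts[0]);
        }
    }

    static boolean atLeastOne(String expression, List<Long> values) {
        // Passes as soon as a single data point satisfies the predicate.
        return values.stream().anyMatch(v -> matches(expression, v));
    }

    public static void main(String[] args) {
        System.out.println(atLeastOne("gt 0", List.of(0L, 0L, 3L))); // true
        System.out.println(atLeastOne("gt 0", List.of(0L, 0L)));    // false
    }
}
```

This tolerates the cold-start minutes of an E2E run, where early samples are often zero before traffic from the controller arrives.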
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
endpoints:
- key: not null
label: /users
nodes:
- id: not null
name: consumer-instance
serviceId: not null
serviceName: consumer
isReal: true
- id: not null
name: not null
serviceId: not null
serviceName: Your_ApplicationName
isReal: true
calls:
- id: not null
source: Y29uc3VtZXI=.1_Y29uc3VtZXItaW5zdGFuY2U=
detectPoints:
- CLIENT
- SERVER
target: not null
instances:
- key: not null
label: not null
endpoints:
- key: not null
label: /users
instances:
- key: not null
label: not null
services:
- key: not null
label: provider
- key: not null
label: consumer
- key: not null
label: Your_ApplicationName
nodes:
- id: not null
name: User
type: USER
isReal: false
- id: not null
name: consumer
type: http
isReal: true
- id: not null
name: provider
type: Express
isReal: true
- id: not null
name: Your_ApplicationName
type: Tomcat
isReal: true
calls:
- id: not null
source: ${User[0]}
detectPoints:
- SERVER
target: ${consumer[0]}
- id: not null
source: ${consumer[0]}
detectPoints:
- CLIENT
- SERVER
target: ${Your_ApplicationName[0]}
- id: not null
source: ${Your_ApplicationName[0]}
detectPoints:
- CLIENT
- SERVER
target: ${provider[0]}
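The `source` and `target` values in the expected topology use `${name[index]}` placeholders, which presumably refer to the id of the index-th discovered node carrying that name, since real node ids are generated at runtime. A minimal sketch of resolving such a placeholder, assuming ids are grouped by node name (the class and method names are illustrative, not the actual matcher implementation):

```java
import java.util.List;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical sketch: resolve "${name[i]}" placeholders in expected topology
// files against node ids discovered at runtime, grouped by node name.
public class PlaceholderSketch {
    private static final Pattern REF = Pattern.compile("\\$\\{(.+)\\[(\\d+)\\]\\}");

    static String resolve(String expected, Map<String, List<String>> idsByName) {
        Matcher m = REF.matcher(expected);
        if (!m.matches()) {
            return expected; // literal value, no placeholder to substitute
        }
        return idsByName.get(m.group(1)).get(Integer.parseInt(m.group(2)));
    }

    public static void main(String[] args) {
        Map<String, List<String>> ids = Map.of("consumer", List.of("abc.1"));
        System.out.println(resolve("${consumer[0]}", ids)); // abc.1
    }
}
```

Literal expectations such as `not null` pass through untouched, so the same resolution step can run over every field of the expected file.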
traces:
- key: not null
endpointNames:
- /users
duration: ge 0
start: gt 0
isError: false
traceIds:
- not null