docs: fix spelling issues #5650

Open · wants to merge 3 commits into base: master
2 changes: 1 addition & 1 deletion docs/aggregations.md
@@ -141,7 +141,7 @@ The following aggregation functions are currently supported:

The `first` and `last` aggregation function calculate the first and last
value in an interval by sorting the data by `id`; `graph-node` enforces
-correctness here by automatically setting the `id` for timeseries entities.
+correctness hereby automatically setting the `id` for timeseries entities.

#### Aggregation expressions

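For context on the sentence this hunk touches: `first` and `last` pick the earliest and latest value in each interval by ordering the interval's rows by `id`. The following SQL is a conceptual sketch only, not the query graph-node generates; the table `token_prices` and its columns (`id`, `ts`, `price`) are illustrative.

```sql
-- Conceptual sketch: `first` keeps the row with the smallest `id` in each
-- hourly bucket; `last` would order by `id` descending instead.
select distinct on (date_trunc('hour', ts))
       date_trunc('hour', ts) as bucket,
       price                  as first_price
  from token_prices
 order by date_trunc('hour', ts), id;
```

Because graph-node assigns the `id` of timeseries entities itself, this ordering is well-defined regardless of how the subgraph writes its data points.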
2 changes: 1 addition & 1 deletion docs/implementation/metadata.md
@@ -144,7 +144,7 @@ should have the 'account-like' optimization turned on.

### `subgraphs.subgraph_features`

-Details about features that a deployment uses, Maintained in the primary.
+Details about features that a deployment uses are maintained in the primary.

| Column | Type | Use |
|----------------|-----------|-------------|
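A minimal sketch of how this table might be consulted on the primary; only the table name comes from the doc, since the hunk shows just the header of the column table, so the `id` column and the placeholder deployment hash below are assumptions.

```sql
-- Illustrative only: look up the recorded features for one deployment.
select *
  from subgraphs.subgraph_features
 where id = 'QmDeploymentHashGoesHere';
```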
2 changes: 1 addition & 1 deletion docs/implementation/sql-query-generation.md
@@ -300,7 +300,7 @@ from the database is that it is impossible with Diesel to execute queries
where the number and types of the columns of the result are not known at
compile time.

-We need to to be careful though to not convert to JSONB too early, as that
+We need to be careful though to not convert to JSONB too early, as that
is slow when done for large numbers of rows. Deferring conversion is
responsible for some of the complexity in these queries.

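To make the point about deferring the JSONB conversion concrete, here is a hand-written sketch, not the SQL graph-node actually generates; the schema, table, and column names are illustrative. Converting only the rows that survive filtering and limiting keeps `to_jsonb` off the hot path.

```sql
-- Illustrative only: to_jsonb runs on at most 100 rows because the inner
-- query filters and limits first, instead of converting every row in the
-- table before filtering.
select to_jsonb(e.*) as data
  from (select *
          from sgd42.token
         where block_range @> 2000000
         order by name
         limit 100) e;
```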