
Conversation

som-snytt
Contributor

@som-snytt som-snytt commented Jul 20, 2022

Implicit definitions are required to have an explicit type ascribed in Scala 3. Now Scala 2 warns if the type is missing (and therefore inferred). Under -Xsource:3, the warning becomes an error.

Local implicits are exempt because they are not available before they are defined.

The warnings may be suppressed globally with -Wconf:cat=other-implicit-type:s or locally using @annotation.nowarn("cat=other-implicit-type").

Note that a whitebox macro may be declared with result type Any and arbitrarily narrowed by its expansion.
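
As a sketch (not taken from this PR's test suite) of the behavior described above, assuming 2.13.11 or later, with the `other-implicit-type` category used for suppression:

```
object ImplicitTypeExamples {
  // Inferred type: warns "Implicit definition should have explicit type"
  // (an error under -Xsource:3) unless suppressed, as done here.
  @annotation.nowarn("cat=other-implicit-type")
  implicit val inferred = List(1, 2, 3)

  // Explicit type ascribed: no warning.
  implicit val explicit: List[Int] = List(1, 2, 3)

  def usesLocalImplicit: Int = {
    implicit val n = 42 // local implicit: exempt, not warned
    implicitly[Int]
  }
}
```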

Fixes scala/bug#5265

@scala-jenkins scala-jenkins added this to the 2.13.10 milestone Jul 20, 2022
@som-snytt
Contributor Author

-Werror FTW.

[error] /home/jenkins/workspace/scala-2.13.x-validate-main/src/reflect/scala/reflect/api/Internals.scala:1101:18: Implicit definition should have explicit type
[error]     implicit val token = new CompatToken
[error]                  ^
[error] one error found
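
Presumably the fix is just to ascribe the type, along these lines (a sketch, not the exact diff):

```
implicit val token: CompatToken = new CompatToken
```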

@som-snytt som-snytt force-pushed the issue/5265-require-explicit-type-for-implicit branch 7 times, most recently from 5ff8926 to f636f20 Compare July 22, 2022 23:33
@som-snytt som-snytt marked this pull request as ready for review July 22, 2022 23:35
@som-snytt som-snytt modified the milestones: 2.13.10, 2.13.9 Jul 22, 2022
@som-snytt som-snytt added the release-notes worth highlighting in next release notes label Jul 22, 2022
@som-snytt som-snytt force-pushed the issue/5265-require-explicit-type-for-implicit branch from f636f20 to a87d504 Compare July 22, 2022 23:44
import scala.language.reflectiveCalls
import scala.language.implicitConversions

object Test extends App {
@nowarn // the inferred type includes the default arg, which can't be written explicitly
Contributor Author

@dwijnand inviting your thoughts on your example: implicits should have explicit types, but the compiler is willing to infer a type that can't be written explicitly. In this case, the default arg is not allowed in the refinement. (One could introduce a trait, but that would negate the test.) Probably you will say: "Rules are made to be broken."
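
(For readers following along: a hypothetical sketch, not the actual example under review, of the kind of definition being discussed, where the inferred structural type involves a method with a default argument that a refinement type cannot express.)

```
import scala.language.reflectiveCalls

object DefaultArgRefinement {
  // The inferred (structural) type of `widget` carries `render` and its
  // default-argument machinery; defaults are not allowed in a refinement,
  // so the inferred type cannot be written out by hand.
  @annotation.nowarn
  implicit val widget = new AnyRef {
    def render(indent: Int = 2): String = " " * indent + "widget"
  }
}
```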

Member

Nice find!

@som-snytt som-snytt force-pushed the issue/5265-require-explicit-type-for-implicit branch from a87d504 to 72e65b7 Compare July 23, 2022 01:03
@som-snytt
Contributor Author

Fields are not marked implicit, and the accessor can't be reached from the field without causing cycles, so mark the field and do the check when typing the getter.

@som-snytt som-snytt force-pushed the issue/5265-require-explicit-type-for-implicit branch 2 times, most recently from d3d304d to 9522547 Compare July 23, 2022 02:47
@SethTisue

This comment was marked as outdated.

@som-snytt
Contributor Author

I didn't switch to draft after handling val, and it still doesn't handle val in traits. Plus a cycle of test breakage.

@som-snytt som-snytt marked this pull request as draft July 25, 2022 13:45
@SethTisue SethTisue modified the milestones: 2.13.9, 2.13.10 Jul 25, 2022
@som-snytt som-snytt changed the title Warn if implicit should have explicit type Warn that implicit should have explicit type Aug 11, 2022
@som-snytt som-snytt force-pushed the issue/5265-require-explicit-type-for-implicit branch from 9522547 to ac2e6ad Compare August 21, 2022 22:26
@som-snytt
Contributor Author

Hung again. I don't see the red X for canceling the Jenkins job.

Travis:

[info] + IndexScript.allPackages: OK, proved property.
[info] Elapsed time: 0.432 sec 
failing seed for mutable.TreeMapProjection.get, contains is WGrJJZh_6eL-dRMg_fX5Komy1UboU3eTd2SlGrJ3d9G=
[info] ! mutable.TreeMapProjection.get, contains: Exception raised on property evaluation.
[info] > ARG_0: HashMap( -> 0)
[info] > ARG_1: None
[info] > ARG_2: Some()
[info] > ARG_0_ORIGINAL: HashMap(պ -> -2147483648)
[info] > ARG_1_ORIGINAL: Some()
[info] > Exception: java.lang.NullPointerException: null
[info] scala.collection.mutable.TreeMap$TreeMapProjection.isInsideViewBounds(TreeMap.scala:193)
[info] scala.collection.mutable.TreeMap$TreeMapProjection.contains(TreeMap.scala:211)
[info] scala.collection.mutable.MutableTreeMapProjectionProperties$.$anonfun$new$125(MutableTreeMap.scala:238)
[info] scala.collection.mutable.MutableTreeMapProjectionProperties$.$anonfun$new$125$adapted(MutableTreeMap.scala:237)
[info] scala.collection.IterableOnceOps.forall(IterableOnce.scala:589)
[info] scala.collection.IterableOnceOps.forall$(IterableOnce.scala:586)
[info] scala.collection.AbstractIterable.forall(Iterable.scala:933)
[info] scala.collection.mutable.MutableTreeMapProjectionProperties$.$anonfun$new$124(MutableTreeMap.scala:237)
[info] scala.collection.mutable.MutableTreeMapProjectionProperties$.$anonfun$new$124$adapted(MutableTreeMap.scala:230)

Locally,

[info] + Range inclusiveByOne.foreach.visited.size: OK, passed 100 tests.
[info] Elapsed time: 0.002 sec
#
# A fatal error has been detected by the Java Runtime Environment:
#
#  SIGSEGV (0xb) at pc=0x00007fad5f9c03fc, pid=30176, tid=30206
#
# JRE version: OpenJDK Runtime Environment (18.0.1.1+2) (build 18.0.1.1+2-6)
# Java VM: OpenJDK 64-Bit Server VM (18.0.1.1+2-6, mixed mode, sharing, tiered, compressed oops, compressed class ptrs, g1 gc, linux-amd64)
# Problematic frame:
# V  [libjvm.so+0xd263fc]
[info] + Range inclusiveByOne.sum: OK, passed 100 tests.
[info] Elapsed time: 0.002 sec
[info] + Range inclusiveByOne.sumCustomNumeric: OK, passed 100 tests.
[info] Elapsed time: 0.004 sec

Travis says both

export ADOPTOPENJDK=8
$ java -Xmx32m -version
openjdk version "11.0.2" 2019-01-15
OpenJDK Runtime Environment 18.9 (build 11.0.2+9)
OpenJDK 64-Bit Server VM 18.9 (build 11.0.2+9, mixed mode)
$ javac -J-Xmx32m -version
javac 11.0.2

@som-snytt
Contributor Author

I should have asked @SethTisue about cancel culture.

@SethTisue

This comment was marked as outdated.

@som-snytt som-snytt force-pushed the issue/5265-require-explicit-type-for-implicit branch from ac2e6ad to f9395e1 Compare August 28, 2022 16:32
@SethTisue

This comment was marked as resolved.

@som-snytt som-snytt force-pushed the issue/5265-require-explicit-type-for-implicit branch from ac5a670 to 1659fdc Compare November 23, 2022 01:09
Member

@lrytz lrytz left a comment


LGTM, thank you!

@SethTisue what's your view on making the warning on by default with -Wconf / @nowarn opt-out?

Member

@SethTisue SethTisue left a comment


> Maybe if we introduce a new warning category and publish the opt-out flag in the release notes, it's OK?

Yes, I think that's sufficient.
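
For reference, a minimal sbt sketch (assuming an sbt build; not part of this PR) of the project-wide opt-out using the category named in the description:

```
// Silence the new warning for a whole project.
Compile / scalacOptions += "-Wconf:cat=other-implicit-type:s"
```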

@SethTisue
Member

Note that this impacts macro authors too; see scala/bug#12798
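
As a hedged illustration of the whitebox point from the description (a sketch, not taken from scala/bug#12798): an implicit whitebox macro can satisfy the explicit-type requirement by declaring `Any`, since each expansion may still narrow that type at the call site.

```
import scala.language.experimental.macros
import scala.reflect.macros.whitebox

object Macros {
  // Declared result type is Any (explicit, so no warning); being whitebox,
  // the expansion's actual type is what callers see.
  implicit def materialize: Any = macro impl
  def impl(c: whitebox.Context): c.Tree = {
    import c.universe._
    q"42" // callers see Int, not Any
  }
}
```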

dongjoon-hyun pushed a commit to apache/spark that referenced this pull request Jun 19, 2023
### What changes were proposed in this pull request?
This PR aims to upgrade Scala to 2.13.11
- https://www.scala-lang.org/news/2.13.11

Additionally, this PR adds a new suppression rule for the warning message `Implicit definition should have explicit type`; this is a new compile check introduced by scala/scala#10083, which we must fix anyway when we upgrade to Scala 3.

### Why are the changes needed?
This release improves collections, adds support for JDK 20 and 21, and adds support for JDK 17 `sealed` classes:
- scala/scala#10363
- scala/scala#10184
- scala/scala#10397
- scala/scala#10348
- scala/scala#10105

There are 2 known issues in this version:

- scala/bug#12800
- scala/bug#12799

For the first one, there are no compilation warning messages related to `match may not be exhaustive` in the Spark compile log; for the second one, there is no use of `method.isAnnotationPresent(Deprecated.class)` in Spark code, only

https://github.com/apache/spark/blob/8c84d2c9349d7b607db949c2e114df781f23e438/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/JavaTypeInference.scala#L130

in the Spark code, and I checked that `javax.annotation.Nonnull` does not have this issue.

So I think these two issues will not affect Spark itself, although that doesn't mean they won't affect code written by end users.

The full release notes are here:

- https://github.com/scala/scala/releases/tag/v2.13.11

### Does this PR introduce _any_ user-facing change?
Yes, this is a Scala version change.

### How was this patch tested?
- Existing tests
- Checked Java 8/17 + Scala 2.13.11 using GA; all tests passed

Java 8 + Scala 2.13.11: https://github.com/LuciferYang/spark/runs/14337279564
Java 17 + Scala 2.13.11: https://github.com/LuciferYang/spark/runs/14343012195

Closes #41626 from LuciferYang/SPARK-40497.

Authored-by: yangjie01 <yangjie01@baidu.com>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
czxm pushed a commit to czxm/spark that referenced this pull request Jun 19, 2023
nicl added a commit to guardian/amigo that referenced this pull request Jun 26, 2023
Scala 2.13 raises compilation warnings (which we have elevated to
errors) for implicits without explicit types. This mirrors Scala 3,
which requires an explicit type.

See:

* scala/scala#10083
* https://github.com/scala/scala/releases/tag/v2.13.11
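
A hedged sketch (not necessarily this repository's exact setting) of how warnings are commonly elevated to errors in an sbt build:

```
// Treat all compiler warnings as errors (Scala 2.13).
Compile / scalacOptions += "-Werror"
```
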
nicl added a commit to guardian/amigo that referenced this pull request Jun 26, 2023
DavidLawes added a commit to guardian/mobile-n10n that referenced this pull request Jul 4, 2023
we treat warnings as fatal, so due to scala/scala#10083 we need to make sure that all implicits have an explicit type
dongjoon-hyun pushed a commit to apache/spark that referenced this pull request Sep 16, 2023
### What changes were proposed in this pull request?
This PR aims to re-upgrade Scala to 2.13.11; after SPARK-45144 was merged, the build issues mentioned in #41943 should no longer exist.

- https://www.scala-lang.org/news/2.13.11

Additionally, this PR adds a new suppression rule for the warning message `Implicit definition should have explicit type`; this is a new compile check introduced by scala/scala#10083, which we must fix anyway when we upgrade to Scala 3.

### Why are the changes needed?
This release improves collections, adds support for JDK 20 and 21, and adds support for JDK 17 `sealed` classes:
- scala/scala#10363
- scala/scala#10184
- scala/scala#10397
- scala/scala#10348
- scala/scala#10105

There are 2 known issues in this version:

- scala/bug#12800
- scala/bug#12799

For the first one, there are no compilation warning messages related to `match may not be exhaustive` in the Spark compile log; for the second one, there is no use of `method.isAnnotationPresent(Deprecated.class)` in Spark code, only

https://github.com/apache/spark/blob/8c84d2c9349d7b607db949c2e114df781f23e438/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/JavaTypeInference.scala#L130

in the Spark code, and I checked that `javax.annotation.Nonnull` does not have this issue.

So I think these two issues will not affect Spark itself, although that doesn't mean they won't affect code written by end users.

The full release notes are here:

- https://github.com/scala/scala/releases/tag/v2.13.11

### Does this PR introduce _any_ user-facing change?
Yes, this is a Scala version change.

### How was this patch tested?
- Existing tests

### Was this patch authored or co-authored using generative AI tooling?
No

Closes #42918 from LuciferYang/SPARK-40497-2.

Authored-by: yangjie01 <yangjie01@baidu.com>
Signed-off-by: Dongjoon Hyun <dhyun@apple.com>
szehon-ho pushed a commit to szehon-ho/spark that referenced this pull request Aug 7, 2024
rtyley added a commit to guardian/tagmanager that referenced this pull request Oct 22, 2024
This is a precursor to:

* #536

Scala 2.13.11 added a warning that implicit definitions [should](
https://nrinaudo.github.io/scala-best-practices/tricky_behaviours/type_implicits.html) have an explicit type (because implicit resolution is already complicated enough for the compiler, and things like file-order can actually make a difference to whether implicit scopes are correctly searched: scala/bug#8697 (comment)):

* scala/scala#10083

The warning looks like this:

```
[error] ~/code/presence-indicator/app/autoscaling/Notification.scala:7:16: Implicit definition should have explicit type (inferred play.api.libs.json.Reads[autoscaling.Notification]) [quickfixable]
[error]   implicit val jsonReads = Json.reads[Notification]
[error]                ^
```

...this prepares us for Scala 3, where the explicit type is _required_.

More widely, beyond implicit definitions, in **library** code, the best practice is to always add an explicit type to all your **public** members, even when you're happy with what's being inferred - otherwise you can unintentionally break binary compatibility just by changing the _implementation_ of a field (see the short illustration after these links):

* https://docs.scala-lang.org/scala3/guides/migration/incompat-type-inference.html
* https://nrinaudo.github.io/scala-best-practices/binary_compat/explicit_type_annotations.html
* https://scalacenter.github.io/scalafix/docs/rules/ExplicitResultTypes.html
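
As a hedged illustration of that binary-compatibility point (the names here are made up):

```
// With no explicit type, the public signature is whatever gets inferred,
// so changing the implementation silently changes the signature:
object V1 { val defaults = List(1, 2, 3) }   // inferred type: List[Int]
object V2 { val defaults = Vector(1, 2, 3) } // inferred type: Vector[Int], a different signature
// Ascribing `val defaults: Seq[Int] = ...` in both keeps the public API stable.
```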

## Automatically fixing this code issue

Scalafix actually does a better job than `-quickfix` for this particular task, because it adds imports if it needs to, so that you end up with this in your code:

```
implicit val jsonReads: Reads[Notification] = Json.reads[Notification]
```

...rather than something like:

```
implicit val jsonReads: Reads[com.gu.blah.foo.bar.Notification] = Json.reads[Notification]
```

### Fixing while still on Scala 2.12 - use Scalafix

In this commit, we're only trying to fix the implicit definitions, so I've added this in a new `.scalafix.conf` config file:

```
ExplicitResultTypes.onlyImplicits = true
```

The Scalafix rule needs to be run while the project is still on Scala 2.12, not Scala 2.13
(otherwise sbt will say: "Error downloading ch.epfl.scala:sbt-scalafix;sbtVersion=1.0;scalaVersion=2.13:0.13.0").

Once the Scalafix plugin is made available to sbt (by adding `addSbtPlugin("ch.epfl.scala" % "sbt-scalafix" % "0.13.0")`
to either `project/plugins.sbt` or `~/.sbt/1.0/plugins.sbt`), you can run these commands on the sbt prompt to automatically generate the changes in this PR:

```
scalafixEnable
scalafixAll ExplicitResultTypes
```
rtyley added a commit to guardian/tagmanager that referenced this pull request Oct 22, 2024