mirror of
https://github.com/nim-lang/Nim.git
synced 2026-01-21 20:10:44 +00:00
.gitignore (vendored)
@@ -57,3 +57,7 @@ dist/

# Private directories and files (IDEs)
.*/
~*

# testament cruft
testresults/
test.txt
@@ -46,3 +46,7 @@ script:
  - tests/testament/tester --pedantic all -d:nimCoroutines
  - ./koch web
  - ./koch csource
  - ./koch nimsuggest
  # - nim c -r nimsuggest/tester
  - ( ! grep -F '.. code-block' -l -r --include '*.html' --exclude contributing.html --exclude docgen.html --exclude tut2.html )
  - ( ! grep -F '..code-block' -l -r --include '*.html' --exclude contributing.html --exclude docgen.html --exclude tut2.html )
changelog.md
@@ -2,11 +2,191 @@

### Changes affecting backwards compatibility

- Moved ``basic2d``/``basic3d`` out of the stdlib and into Nimble packages.
  These packages are deprecated, however; use ``glm``, ``arraymancer``, ``neo``
  or another package instead.

- Arrays of char cannot be converted to ``cstring`` anymore; pointers to
  arrays of char can! This means ``$`` for arrays can finally exist
  in ``system.nim`` and do the right thing.
- JSON: Deprecated `getBVal`, `getFNum`, and `getNum` in favour of
  `getBool`, `getFloat`, `getBiggestInt`. A `getInt` procedure was also added.
- ``echo`` now works with strings that contain ``\0`` (the binary zero is not
  shown) and ``nil`` strings are equal to empty strings.
- `reExtended` is no longer the default for the `re` constructor in the `re`
  module.
- The overloading rules changed slightly so that constrained generics are
  preferred over unconstrained generics. (Bug #6526)
- It is now possible to forward declare object types so that mutually
  recursive types can be created across module boundaries. See
  [package level objects](https://nim-lang.org/docs/manual.html#package-level-objects)
  for more information.
- The **unary** ``<`` is now deprecated; for ``.. <`` use ``..<``, for other usages
  use the ``pred`` proc.
- We changed how array accesses "from backwards" like ``a[^1]`` or ``a[0..^1]`` are
  implemented. These are now implemented purely in ``system.nim`` without compiler
  support. There is a new "heterogeneous" slice type ``system.HSlice`` that takes 2
  generic parameters which can be ``BackwardsIndex`` indices. ``BackwardsIndex`` is
  produced by ``system.^``.
  This means if you overload ``[]`` or ``[]=`` you need to ensure they also work
  with ``system.BackwardsIndex`` (if applicable for the accessors); see the sketch below.
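  For instance, a container that overloads ``[]`` can add an overload taking
  ``BackwardsIndex`` so that ``a[^1]`` keeps working. A minimal sketch (the ``Ring``
  type is made up for illustration):

  ```nim
  type Ring = object
    data: array[4, int]

  proc `[]`(r: Ring, i: int): int = r.data[i]
  # also accept BackwardsIndex so that r[^1] works:
  proc `[]`(r: Ring, i: BackwardsIndex): int = r.data[r.data.len - int(i)]

  var r = Ring(data: [10, 20, 30, 40])
  echo r[0]   # 10
  echo r[^1]  # 40
  ```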
- ``mod`` and bitwise ``and`` do not produce ``range`` subtypes anymore. This
  turned out to be more harmful than helpful and the language is simpler
  without this special typing rule.
- Added ``algorithm.rotateLeft``.
- ``rationals.toRational`` now uses an algorithm based on continued fractions.
  This means its results are more precise and it can't run into an infinite loop
  anymore.
- Added ``typetraits.$`` as an alias for ``typetraits.name``.
- ``os.getEnv`` now takes an optional ``default`` parameter that tells ``getEnv``
  what to return if the environment variable does not exist.
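  A small sketch of the new ``default`` parameter (the variable name and fallback
  path are made up):

  ```nim
  import os

  # returns "/tmp/cache" when NIM_CACHE_DIR is not set, instead of ""
  let cacheDir = getEnv("NIM_CACHE_DIR", "/tmp/cache")
  echo cacheDir
  ```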
- Bodies of ``for`` loops now get their own scope:

  ```nim
  # now compiles:
  for i in 0..4:
    let i = i + 1
    echo i
  ```

- The parsing rules of ``if`` expressions were changed so that multiple
  statements are allowed in the branches. We found only a few code examples that
  now fail because of this change; here is one:

  ```nim
  t[ti] = if exp_negative: '-' else: '+'; inc(ti)
  ```

  This now needs to be written as:

  ```nim
  t[ti] = (if exp_negative: '-' else: '+'); inc(ti)
  ```

- To make Nim even more robust the system iterators ``..`` and ``countup``
  now only accept a single generic type ``T``. This means the following code
  doesn't die with an "out of range" error anymore:

  ```nim
  var b = 5.Natural
  var a = -5
  for i in a..b:
    echo i
  ```

- ``formatFloat``/``formatBiggestFloat`` now support formatting floats with zero
  precision digits. The previous ``precision = 0`` behavior (default formatting)
  is now available via ``precision = -1``.
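  A quick sketch of the distinction (the exact rounded output depends on the value):

  ```nim
  import strutils

  echo formatFloat(123.456, ffDecimal, precision = 0)   # no fractional digits at all
  echo formatFloat(123.456, ffDecimal, precision = -1)  # the old default formatting
  ```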
- The ``nim doc`` command is now an alias for ``nim doc2``, the second version of
  the documentation generator. The old version 1 can still be accessed
  via the new ``nim doc0`` command.
- Added ``system.getStackTraceEntries`` that allows you to access the stack
  trace in a structured manner without string parsing.
- Added ``sequtils.mapLiterals`` for easier construction of array and tuple
  literals.
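  A minimal sketch of how ``mapLiterals`` applies a conversion to every literal in a
  constructor:

  ```nim
  import sequtils

  let ints = mapLiterals([0.1, 1.2, 2.3, 3.4], int)
  echo ints   # [0, 1, 2, 3]
  ```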
- Added ``parseutils.parseSaturatedNatural``.
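  A short sketch: the proc parses a natural number and saturates on overflow instead
  of raising:

  ```nim
  import parseutils

  var value: int
  # returns the number of characters processed; huge inputs saturate at high(int)
  let processed = parseSaturatedNatural("848", value)
  echo value      # 848
  echo processed  # 3
  ```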
- ``atomic`` and ``generic`` are no longer keywords in Nim. ``generic`` used to be
  an alias for ``concept``; ``atomic`` was not used for anything.
- Moved from stdlib into Nimble packages:
  - [``basic2d``](https://github.com/nim-lang/basic2d)
    _deprecated: use ``glm``, ``arraymancer``, ``neo``, or another package instead_
  - [``basic3d``](https://github.com/nim-lang/basic3d)
    _deprecated: use ``glm``, ``arraymancer``, ``neo``, or another package instead_
  - [``gentabs``](https://github.com/lcrees/gentabs)
  - [``libuv``](https://github.com/lcrees/libuv)
  - [``numeric``](https://github.com/lcrees/polynumeric)
  - [``poly``](https://github.com/lcrees/polynumeric)
  - [``pdcurses``](https://github.com/lcrees/pdcurses)
  - [``romans``](https://github.com/lcrees/romans)

- Added ``system.runnableExamples`` to make examples in Nim's documentation easier
  to write and test. The examples are tested as the last step of
  ``nim doc``; a sketch is shown below.
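  A minimal sketch of a documented proc whose example is compiled and run by
  ``nim doc``:

  ```nim
  proc double*(x: int): int =
    ## Returns ``x`` doubled.
    runnableExamples:
      doAssert double(21) == 42
    result = x * 2
  ```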
- Nim's ``rst2html`` command now supports the testing of code snippets via an RST
  extension that we called ``:test:``:

  ```rst
  .. code-block:: nim
      :test:
      # shows how the 'if' statement works
      if true: echo "yes"
  ```
- The ``[]`` proc for strings now raises an ``IndexError`` exception when
  the specified slice is out of bounds. See issue
  [#6223](https://github.com/nim-lang/Nim/issues/6223) for more details.
  You can use ``substr(str, start, finish)`` to get the old behaviour back;
  see [this commit](https://github.com/nim-lang/nimbot/commit/98cc031a27ea89947daa7f0bb536bcf86462941f) for an example.
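  A short sketch of the difference:

  ```nim
  let s = "abc"
  # echo s[1..5]        # now raises IndexError: the slice is out of bounds
  echo substr(s, 1, 5)  # "bc" — substr clamps to the string's bounds, as `[]` used to
  ```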
- ``strutils.split`` and ``strutils.rsplit`` with an empty string and a
  separator now return that empty string.
  See issue [#4377](https://github.com/nim-lang/Nim/issues/4377).
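  A sketch of the new behaviour (previously the result was an empty seq):

  ```nim
  import strutils

  echo "".split(',')   # @[""] — the empty input comes back as its only element
  ```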
- The experimental overloading of the dot ``.`` operators now takes
  an ``untyped`` parameter as the field name; it used to be
  a ``static[string]``. You can use ``when defined(nimNewDot)`` to make
  your code work with both old and new Nim versions.
  See [special-operators](https://nim-lang.org/docs/manual.html#special-operators)
  for more information.
- Added ``macros.unpackVarargs``.
- The memory manager now uses a variant of the TLSF algorithm that has much
  better memory fragmentation behaviour. According
  to [http://www.gii.upv.es/tlsf/](http://www.gii.upv.es/tlsf/) the maximum
  fragmentation measured is lower than 25%. As a nice bonus ``alloc`` and
  ``dealloc`` became O(1) operations.
- The behavior of ``$`` has been changed for all standard library collections. The
  collection-to-string implementations now perform proper quoting and escaping of
  strings and chars.
- The ``random`` procs in ``random.nim`` have all been deprecated. Instead use
  the new ``rand`` procs. The module now exports the state of the random
  number generator as type ``Rand`` so multiple threads can easily use their
  own random number generators that do not require locking. For more information
  about this rename see issue [#6934](https://github.com/nim-lang/Nim/issues/6934).
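  A small sketch of the replacement API (the seed value is arbitrary):

  ```nim
  import random

  var rng = initRand(2018)   # each thread can keep its own Rand state
  echo rng.rand(100)         # replaces the deprecated random(100)
  ```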
- The compiler is now more consistent in its treatment of ambiguous symbols:
  Types that shadow procs and vice versa are marked as ambiguous (bug #6693).
- ``yield`` (or ``await``, which is mapped to ``yield``) never worked reliably
  in an array, seq or object constructor and is now prevented at compile-time.
- For string formatting / interpolation a new module
  called [strformat](https://nim-lang.org/docs/strformat.html) has been added
  to the stdlib.
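  A minimal sketch of the new interpolation syntax:

  ```nim
  import strformat

  let name = "Nim"
  echo fmt"hello {name}, 2 + 2 = {2 + 2}"
  echo &"the & form works too: {name.len} chars"
  ```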
- The ``codegenDecl`` pragma now works for the JavaScript backend. It returns an
  empty string for function return type placeholders.
- Asynchronous programming for the JavaScript backend using the `asyncjs` module.
- Extra semantic checks for procs with the ``noreturn`` pragma: a return type is not
  allowed, and statements after a call to a noreturn proc are no longer allowed.
- Calls to noreturn procs and branches that raise exceptions are now skipped during
  common type deduction in ``if`` and ``case`` expressions. The following code
  snippets now compile:

  ```nim
  import strutils
  let str = "Y"
  let a = case str:
    of "Y": true
    of "N": false
    else: raise newException(ValueError, "Invalid boolean")
  let b = case str:
    of nil, "": raise newException(ValueError, "Invalid boolean")
    elif str.startsWith("Y"): true
    elif str.startsWith("N"): false
    else: false
  let c = if str == "Y": true
    elif str == "N": false
    else:
      echo "invalid bool"
      quit("this is the end")
  ```
- Proc [toCountTable](https://nim-lang.org/docs/tables.html#toCountTable,openArray[A]) now produces a `CountTable` with values corresponding to the number of occurrences of the key in the input. It used to produce a table with all values set to `1`.

  Counting occurrences in a sequence used to be:

  ```nim
  let mySeq = @[1, 2, 1, 3, 1, 4]
  var myCounter = initCountTable[int]()

  for item in mySeq:
    myCounter.inc item
  ```

  Now, you can simply do:

  ```nim
  let
    mySeq = @[1, 2, 1, 3, 1, 4]
    myCounter = mySeq.toCountTable()
  ```

- Added support for casting between integers of the same bit size in the VM
  (compile time and NimScript). This allows, among other things, reinterpreting
  signed integers as unsigned.

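  A tiny sketch of the new VM capability:

  ```nim
  static:
    # evaluated at compile time by the VM
    let x = -1'i16
    doAssert cast[uint16](x) == 0xFFFF'u16
  ```
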
@@ -179,5 +179,11 @@ proc isPartOf*(a, b: PNode): TAnalysisResult =
|
||||
result = isPartOf(a[0], b)
|
||||
if result == arNo: result = arMaybe
|
||||
else: discard
|
||||
of nkObjConstr:
|
||||
result = arNo
|
||||
for i in 1..<b.len:
|
||||
let res = isPartOf(a, b[i][1])
|
||||
if res != arNo:
|
||||
result = res
|
||||
if res == arYes: break
|
||||
else: discard
|
||||
|
||||
|
||||
@@ -62,8 +62,8 @@ type
|
||||
nkTripleStrLit, # a triple string literal """
|
||||
nkNilLit, # the nil literal
|
||||
# end of atoms
|
||||
nkMetaNode_Obsolete, # difficult to explain; represents itself
|
||||
# (used for macros)
|
||||
nkComesFrom, # "comes from" template/macro information for
|
||||
# better stack trace generation
|
||||
nkDotCall, # used to temporarily flag a nkCall node;
|
||||
# this is used
|
||||
# for transforming ``s.len`` to ``len(s)``
|
||||
@@ -639,7 +639,8 @@ type
|
||||
mEqIdent, mEqNimrodNode, mSameNodeType, mGetImpl,
|
||||
mNHint, mNWarning, mNError,
|
||||
mInstantiationInfo, mGetTypeInfo, mNGenSym,
|
||||
mNimvm, mIntDefine, mStrDefine
|
||||
mNimvm, mIntDefine, mStrDefine, mRunnableExamples,
|
||||
mException
|
||||
|
||||
# things that we can evaluate safely at compile time, even if not asked for it:
|
||||
const
|
||||
@@ -744,6 +745,8 @@ type
|
||||
OnUnknown, # location is unknown (stack, heap or static)
|
||||
OnStatic, # in a static section
|
||||
OnStack, # location is on hardware stack
|
||||
OnStackShadowDup, # location is on the stack but also replicated
|
||||
# on the shadow stack
|
||||
OnHeap # location is on heap or global
|
||||
# (reference counting needed)
|
||||
TLocFlags* = set[TLocFlag]
|
||||
@@ -753,6 +756,7 @@ type
|
||||
flags*: TLocFlags # location's flags
|
||||
lode*: PNode # Node where the location came from; can be faked
|
||||
r*: Rope # rope value of location (code generators)
|
||||
dup*: Rope # duplicated location for precise stack scans
|
||||
|
||||
# ---------------- end of backend information ------------------------------
|
||||
|
||||
@@ -1017,16 +1021,11 @@ proc add*(father, son: PNode) =
|
||||
|
||||
type Indexable = PNode | PType
|
||||
|
||||
template `[]`*(n: Indexable, i: int): Indexable =
|
||||
n.sons[i]
|
||||
template `[]`*(n: Indexable, i: int): Indexable = n.sons[i]
|
||||
template `[]=`*(n: Indexable, i: int; x: Indexable) = n.sons[i] = x
|
||||
|
||||
template `-|`*(b, s: untyped): untyped =
|
||||
(if b >= 0: b else: s.len + b)
|
||||
|
||||
# son access operators with support for negative indices
|
||||
template `{}`*(n: Indexable, i: int): untyped = n[i -| n]
|
||||
template `{}=`*(n: Indexable, i: int, s: Indexable) =
|
||||
n.sons[i -| n] = s
|
||||
template `[]`*(n: Indexable, i: BackwardsIndex): Indexable = n[n.len - i.int]
|
||||
template `[]=`*(n: Indexable, i: BackwardsIndex; x: Indexable) = n[n.len - i.int] = x
|
||||
|
||||
when defined(useNodeIds):
|
||||
const nodeIdToDebug* = -1 # 299750 # 300761 #300863 # 300879
|
||||
@@ -1036,9 +1035,9 @@ proc newNode*(kind: TNodeKind): PNode =
|
||||
new(result)
|
||||
result.kind = kind
|
||||
#result.info = UnknownLineInfo() inlined:
|
||||
result.info.fileIndex = int32(- 1)
|
||||
result.info.col = int16(- 1)
|
||||
result.info.line = int16(- 1)
|
||||
result.info.fileIndex = int32(-1)
|
||||
result.info.col = int16(-1)
|
||||
result.info.line = int16(-1)
|
||||
when defined(useNodeIds):
|
||||
result.id = gNodeId
|
||||
if result.id == nodeIdToDebug:
|
||||
@@ -1392,6 +1391,14 @@ proc skipTypes*(t: PType, kinds: TTypeKinds): PType =
|
||||
result = t
|
||||
while result.kind in kinds: result = lastSon(result)
|
||||
|
||||
proc skipTypes*(t: PType, kinds: TTypeKinds; maxIters: int): PType =
|
||||
result = t
|
||||
var i = maxIters
|
||||
while result.kind in kinds:
|
||||
result = lastSon(result)
|
||||
dec i
|
||||
if i == 0: return nil
|
||||
|
||||
proc skipTypesOrNil*(t: PType, kinds: TTypeKinds): PType =
|
||||
## same as skipTypes but handles 'nil'
|
||||
result = t
|
||||
@@ -1405,7 +1412,7 @@ proc isGCedMem*(t: PType): bool {.inline.} =
|
||||
|
||||
proc propagateToOwner*(owner, elem: PType) =
|
||||
const HaveTheirOwnEmpty = {tySequence, tyOpt, tySet, tyPtr, tyRef, tyProc}
|
||||
owner.flags = owner.flags + (elem.flags * {tfHasMeta})
|
||||
owner.flags = owner.flags + (elem.flags * {tfHasMeta, tfTriggersCompileTime})
|
||||
if tfNotNil in elem.flags:
|
||||
if owner.kind in {tyGenericInst, tyGenericBody, tyGenericInvocation}:
|
||||
owner.flags.incl tfNotNil
|
||||
@@ -1420,15 +1427,12 @@ proc propagateToOwner*(owner, elem: PType) =
|
||||
owner.flags.incl tfHasMeta
|
||||
|
||||
if tfHasAsgn in elem.flags:
|
||||
let o2 = elem.skipTypes({tyGenericInst, tyAlias})
|
||||
let o2 = owner.skipTypes({tyGenericInst, tyAlias})
|
||||
if o2.kind in {tyTuple, tyObject, tyArray,
|
||||
tySequence, tyOpt, tySet, tyDistinct}:
|
||||
o2.flags.incl tfHasAsgn
|
||||
owner.flags.incl tfHasAsgn
|
||||
|
||||
if tfTriggersCompileTime in elem.flags:
|
||||
owner.flags.incl tfTriggersCompileTime
|
||||
|
||||
if owner.kind notin {tyProc, tyGenericInst, tyGenericBody,
|
||||
tyGenericInvocation, tyPtr}:
|
||||
let elemB = elem.skipTypes({tyGenericInst, tyAlias})
|
||||
@@ -1442,6 +1446,10 @@ proc rawAddSon*(father, son: PType) =
|
||||
add(father.sons, son)
|
||||
if not son.isNil: propagateToOwner(father, son)
|
||||
|
||||
proc rawAddSonNoPropagationOfTypeFlags*(father, son: PType) =
|
||||
if isNil(father.sons): father.sons = @[]
|
||||
add(father.sons, son)
|
||||
|
||||
proc addSonNilAllowed*(father, son: PNode) =
|
||||
if isNil(father.sons): father.sons = @[]
|
||||
add(father.sons, son)
|
||||
@@ -1604,10 +1612,10 @@ proc hasPattern*(s: PSym): bool {.inline.} =
|
||||
result = isRoutine(s) and s.ast.sons[patternPos].kind != nkEmpty
|
||||
|
||||
iterator items*(n: PNode): PNode =
|
||||
for i in 0.. <n.safeLen: yield n.sons[i]
|
||||
for i in 0..<n.safeLen: yield n.sons[i]
|
||||
|
||||
iterator pairs*(n: PNode): tuple[i: int, n: PNode] =
|
||||
for i in 0.. <n.len: yield (i, n.sons[i])
|
||||
for i in 0..<n.len: yield (i, n.sons[i])
|
||||
|
||||
proc isAtom*(n: PNode): bool {.inline.} =
|
||||
result = n.kind >= nkNone and n.kind <= nkNilLit
|
||||
@@ -1663,3 +1671,10 @@ when false:
|
||||
if n.isNil: return true
|
||||
for i in 0 ..< n.safeLen:
|
||||
if n[i].containsNil: return true
|
||||
|
||||
template hasDestructor*(t: PType): bool = tfHasAsgn in t.flags
|
||||
template incompleteType*(t: PType): bool =
|
||||
t.sym != nil and {sfForward, sfNoForward} * t.sym.flags == {sfForward}
|
||||
|
||||
template typeCompleted*(s: PSym) =
|
||||
incl s.flags, sfNoForward
|
||||
|
||||
@@ -102,7 +102,7 @@ proc hashTree(c: var MD5Context, n: PNode) =
|
||||
of nkStrLit..nkTripleStrLit:
|
||||
c &= n.strVal
|
||||
else:
|
||||
for i in 0.. <n.len: hashTree(c, n.sons[i])
|
||||
for i in 0..<n.len: hashTree(c, n.sons[i])
|
||||
|
||||
proc hashType(c: var MD5Context, t: PType) =
|
||||
# modelled after 'typeToString'
|
||||
@@ -151,13 +151,13 @@ proc hashType(c: var MD5Context, t: PType) =
|
||||
c.hashType(t.sons[0])
|
||||
of tyProc:
|
||||
c &= (if tfIterator in t.flags: "iterator " else: "proc ")
|
||||
for i in 0.. <t.len: c.hashType(t.sons[i])
|
||||
for i in 0..<t.len: c.hashType(t.sons[i])
|
||||
md5Update(c, cast[cstring](addr(t.callConv)), 1)
|
||||
|
||||
if tfNoSideEffect in t.flags: c &= ".noSideEffect"
|
||||
if tfThread in t.flags: c &= ".thread"
|
||||
else:
|
||||
for i in 0.. <t.len: c.hashType(t.sons[i])
|
||||
for i in 0..<t.len: c.hashType(t.sons[i])
|
||||
if tfNotNil in t.flags: c &= "not nil"
|
||||
|
||||
proc canonConst(n: PNode): TUid =
|
||||
|
||||
@@ -11,7 +11,7 @@
|
||||
|
||||
proc leftAppearsOnRightSide(le, ri: PNode): bool =
|
||||
if le != nil:
|
||||
for i in 1 .. <ri.len:
|
||||
for i in 1 ..< ri.len:
|
||||
let r = ri[i]
|
||||
if isPartOf(le, r) != arNo: return true
|
||||
|
||||
@@ -364,7 +364,7 @@ proc genPatternCall(p: BProc; ri: PNode; pat: string; typ: PType): Rope =
|
||||
of '@':
|
||||
if j < ri.len:
|
||||
result.add genOtherArg(p, ri, j, typ)
|
||||
for k in j+1 .. < ri.len:
|
||||
for k in j+1 ..< ri.len:
|
||||
result.add(~", ")
|
||||
result.add genOtherArg(p, ri, k, typ)
|
||||
inc i
|
||||
@@ -377,7 +377,7 @@ proc genPatternCall(p: BProc; ri: PNode; pat: string; typ: PType): Rope =
|
||||
result.add(~"(")
|
||||
if 1 < ri.len:
|
||||
result.add genOtherArg(p, ri, 1, typ)
|
||||
for k in j+1 .. < ri.len:
|
||||
for k in j+1 ..< ri.len:
|
||||
result.add(~", ")
|
||||
result.add genOtherArg(p, ri, k, typ)
|
||||
result.add(~")")
|
||||
|
||||
@@ -228,7 +228,7 @@ proc genOptAsgnTuple(p: BProc, dest, src: TLoc, flags: TAssignmentFlags) =
|
||||
else:
|
||||
flags
|
||||
let t = skipTypes(dest.t, abstractInst).getUniqueType()
|
||||
for i in 0 .. <t.len:
|
||||
for i in 0 ..< t.len:
|
||||
let t = t.sons[i]
|
||||
let field = "Field$1" % [i.rope]
|
||||
genAssignment(p, optAsgnLoc(dest, t, field),
|
||||
@@ -270,10 +270,10 @@ proc genGenericAsgn(p: BProc, dest, src: TLoc, flags: TAssignmentFlags) =
|
||||
addrLoc(dest), addrLoc(src), rdLoc(dest))
|
||||
else:
|
||||
linefmt(p, cpsStmts, "#genericShallowAssign((void*)$1, (void*)$2, $3);$n",
|
||||
addrLoc(dest), addrLoc(src), genTypeInfo(p.module, dest.t))
|
||||
addrLoc(dest), addrLoc(src), genTypeInfo(p.module, dest.t, dest.lode.info))
|
||||
else:
|
||||
linefmt(p, cpsStmts, "#genericAssign((void*)$1, (void*)$2, $3);$n",
|
||||
addrLoc(dest), addrLoc(src), genTypeInfo(p.module, dest.t))
|
||||
addrLoc(dest), addrLoc(src), genTypeInfo(p.module, dest.t, dest.lode.info))
|
||||
|
||||
proc genAssignment(p: BProc, dest, src: TLoc, flags: TAssignmentFlags) =
|
||||
# This function replaces all other methods for generating
|
||||
@@ -291,7 +291,8 @@ proc genAssignment(p: BProc, dest, src: TLoc, flags: TAssignmentFlags) =
|
||||
genRefAssign(p, dest, src, flags)
|
||||
else:
|
||||
linefmt(p, cpsStmts, "#genericSeqAssign($1, $2, $3);$n",
|
||||
addrLoc(dest), rdLoc(src), genTypeInfo(p.module, dest.t))
|
||||
addrLoc(dest), rdLoc(src),
|
||||
genTypeInfo(p.module, dest.t, dest.lode.info))
|
||||
of tyString:
|
||||
if (needToCopy notin flags and src.storage != OnStatic) or canMove(src.lode):
|
||||
genRefAssign(p, dest, src, flags)
|
||||
@@ -352,7 +353,8 @@ proc genAssignment(p: BProc, dest, src: TLoc, flags: TAssignmentFlags) =
|
||||
if needsComplexAssignment(dest.t):
|
||||
linefmt(p, cpsStmts, # XXX: is this correct for arrays?
|
||||
"#genericAssignOpenArray((void*)$1, (void*)$2, $1Len_0, $3);$n",
|
||||
addrLoc(dest), addrLoc(src), genTypeInfo(p.module, dest.t))
|
||||
addrLoc(dest), addrLoc(src),
|
||||
genTypeInfo(p.module, dest.t, dest.lode.info))
|
||||
else:
|
||||
useStringh(p.module)
|
||||
linefmt(p, cpsStmts,
|
||||
@@ -393,14 +395,17 @@ proc genDeepCopy(p: BProc; dest, src: TLoc) =
|
||||
of tyPtr, tyRef, tyProc, tyTuple, tyObject, tyArray:
|
||||
# XXX optimize this
|
||||
linefmt(p, cpsStmts, "#genericDeepCopy((void*)$1, (void*)$2, $3);$n",
|
||||
addrLoc(dest), addrLocOrTemp(src), genTypeInfo(p.module, dest.t))
|
||||
addrLoc(dest), addrLocOrTemp(src),
|
||||
genTypeInfo(p.module, dest.t, dest.lode.info))
|
||||
of tySequence, tyString:
|
||||
linefmt(p, cpsStmts, "#genericSeqDeepCopy($1, $2, $3);$n",
|
||||
addrLoc(dest), rdLoc(src), genTypeInfo(p.module, dest.t))
|
||||
addrLoc(dest), rdLoc(src),
|
||||
genTypeInfo(p.module, dest.t, dest.lode.info))
|
||||
of tyOpenArray, tyVarargs:
|
||||
linefmt(p, cpsStmts,
|
||||
"#genericDeepCopyOpenArray((void*)$1, (void*)$2, $1Len_0, $3);$n",
|
||||
addrLoc(dest), addrLocOrTemp(src), genTypeInfo(p.module, dest.t))
|
||||
addrLoc(dest), addrLocOrTemp(src),
|
||||
genTypeInfo(p.module, dest.t, dest.lode.info))
|
||||
of tySet:
|
||||
if mapType(ty) == ctArray:
|
||||
useStringh(p.module)
|
||||
@@ -678,9 +683,10 @@ proc genDeref(p: BProc, e: PNode, d: var TLoc; enforceDeref=false) =
|
||||
d.storage = OnHeap
|
||||
else:
|
||||
var a: TLoc
|
||||
var typ = skipTypes(e.sons[0].typ, abstractInst)
|
||||
var typ = e.sons[0].typ
|
||||
if typ.kind in {tyUserTypeClass, tyUserTypeClassInst} and typ.isResolvedUserTypeClass:
|
||||
typ = typ.lastSon
|
||||
typ = typ.skipTypes(abstractInst)
|
||||
if typ.kind == tyVar and tfVarIsPtr notin typ.flags and p.module.compileToCpp and e.sons[0].kind == nkHiddenAddr:
|
||||
initLocExprSingleUse(p, e[0][0], d)
|
||||
return
|
||||
@@ -793,8 +799,7 @@ proc genRecordField(p: BProc, e: PNode, d: var TLoc) =
|
||||
|
||||
proc genInExprAux(p: BProc, e: PNode, a, b, d: var TLoc)
|
||||
|
||||
proc genFieldCheck(p: BProc, e: PNode, obj: Rope, field: PSym;
|
||||
origTy: PType) =
|
||||
proc genFieldCheck(p: BProc, e: PNode, obj: Rope, field: PSym) =
|
||||
var test, u, v: TLoc
|
||||
for i in countup(1, sonsLen(e) - 1):
|
||||
var it = e.sons[i]
|
||||
@@ -806,12 +811,10 @@ proc genFieldCheck(p: BProc, e: PNode, obj: Rope, field: PSym;
|
||||
assert(disc.kind == nkSym)
|
||||
initLoc(test, locNone, it, OnStack)
|
||||
initLocExpr(p, it.sons[1], u)
|
||||
var o = obj
|
||||
let d = lookupFieldAgain(p, origTy, disc.sym, o)
|
||||
initLoc(v, locExpr, disc, OnUnknown)
|
||||
v.r = o
|
||||
v.r = obj
|
||||
v.r.add(".")
|
||||
v.r.add(d.loc.r)
|
||||
v.r.add(disc.sym.loc.r)
|
||||
genInExprAux(p, it, u, v, test)
|
||||
let id = nodeTableTestOrSet(p.module.dataCache,
|
||||
newStrNode(nkStrLit, field.name.s), p.module.labels)
|
||||
@@ -837,7 +840,7 @@ proc genCheckedRecordField(p: BProc, e: PNode, d: var TLoc) =
|
||||
if field.loc.r == nil: fillObjectFields(p.module, ty)
|
||||
if field.loc.r == nil:
|
||||
internalError(e.info, "genCheckedRecordField") # generate the checks:
|
||||
genFieldCheck(p, e, r, field, ty)
|
||||
genFieldCheck(p, e, r, field)
|
||||
add(r, rfmt(nil, ".$1", field.loc.r))
|
||||
putIntoDest(p, d, e.sons[0], r, a.storage)
|
||||
else:
|
||||
@@ -847,7 +850,7 @@ proc genArrayElem(p: BProc, n, x, y: PNode, d: var TLoc) =
|
||||
var a, b: TLoc
|
||||
initLocExpr(p, x, a)
|
||||
initLocExpr(p, y, b)
|
||||
var ty = skipTypes(skipTypes(a.t, abstractVarRange), abstractPtrs)
|
||||
var ty = skipTypes(a.t, abstractVarRange + abstractPtrs + tyUserTypeClasses)
|
||||
var first = intLiteral(firstOrd(ty))
|
||||
# emit range check:
|
||||
if optBoundsCheck in p.options and tfUncheckedArray notin ty.flags:
|
||||
@@ -965,23 +968,30 @@ proc genEcho(p: BProc, n: PNode) =
|
||||
# this unusual way of implementing it ensures that e.g. ``echo("hallo", 45)``
|
||||
# is threadsafe.
|
||||
internalAssert n.kind == nkBracket
|
||||
var args: Rope = nil
|
||||
var a: TLoc
|
||||
for i in countup(0, n.len-1):
|
||||
if n.sons[i].skipConv.kind == nkNilLit:
|
||||
add(args, ", \"nil\"")
|
||||
else:
|
||||
initLocExpr(p, n.sons[i], a)
|
||||
addf(args, ", $1? ($1)->data:\"nil\"", [rdLoc(a)])
|
||||
if platform.targetOS == osGenode:
|
||||
# bypass libc and print directly to the Genode LOG session
|
||||
var args: Rope = nil
|
||||
var a: TLoc
|
||||
for i in countup(0, n.len-1):
|
||||
if n.sons[i].skipConv.kind == nkNilLit:
|
||||
add(args, ", \"nil\"")
|
||||
else:
|
||||
initLocExpr(p, n.sons[i], a)
|
||||
addf(args, ", $1? ($1)->data:\"nil\"", [rdLoc(a)])
|
||||
p.module.includeHeader("<base/log.h>")
|
||||
linefmt(p, cpsStmts, """Genode::log(""$1);$n""", args)
|
||||
else:
|
||||
p.module.includeHeader("<stdio.h>")
|
||||
linefmt(p, cpsStmts, "printf($1$2);$n",
|
||||
makeCString(repeat("%s", n.len) & tnl), args)
|
||||
linefmt(p, cpsStmts, "fflush(stdout);$n")
|
||||
if n.len == 0:
|
||||
linefmt(p, cpsStmts, "#echoBinSafe(NIM_NIL, $1);$n", n.len.rope)
|
||||
else:
|
||||
var a: TLoc
|
||||
initLocExpr(p, n, a)
|
||||
linefmt(p, cpsStmts, "#echoBinSafe($1, $2);$n", a.rdLoc, n.len.rope)
|
||||
when false:
|
||||
p.module.includeHeader("<stdio.h>")
|
||||
linefmt(p, cpsStmts, "printf($1$2);$n",
|
||||
makeCString(repeat("%s", n.len) & tnl), args)
|
||||
linefmt(p, cpsStmts, "fflush(stdout);$n")
|
||||
|
||||
proc gcUsage(n: PNode) =
|
||||
if gSelectedGC == gcNone: message(n.info, warnGcMem, n.renderTree)
|
||||
@@ -1094,7 +1104,8 @@ proc genReset(p: BProc, n: PNode) =
|
||||
var a: TLoc
|
||||
initLocExpr(p, n.sons[1], a)
|
||||
linefmt(p, cpsStmts, "#genericReset((void*)$1, $2);$n",
|
||||
addrLoc(a), genTypeInfo(p.module, skipTypes(a.t, {tyVar})))
|
||||
addrLoc(a),
|
||||
genTypeInfo(p.module, skipTypes(a.t, {tyVar}), n.info))
|
||||
|
||||
proc rawGenNew(p: BProc, a: TLoc, sizeExpr: Rope) =
|
||||
var sizeExpr = sizeExpr
|
||||
@@ -1108,7 +1119,7 @@ proc rawGenNew(p: BProc, a: TLoc, sizeExpr: Rope) =
|
||||
sizeExpr = "sizeof($1)" %
|
||||
[getTypeDesc(p.module, bt)]
|
||||
let args = [getTypeDesc(p.module, typ),
|
||||
genTypeInfo(p.module, typ),
|
||||
genTypeInfo(p.module, typ, a.lode.info),
|
||||
sizeExpr]
|
||||
if a.storage == OnHeap and usesNativeGC():
|
||||
# use newObjRC1 as an optimization
|
||||
@@ -1138,7 +1149,7 @@ proc genNew(p: BProc, e: PNode) =
|
||||
proc genNewSeqAux(p: BProc, dest: TLoc, length: Rope) =
|
||||
let seqtype = skipTypes(dest.t, abstractVarRange)
|
||||
let args = [getTypeDesc(p.module, seqtype),
|
||||
genTypeInfo(p.module, seqtype), length]
|
||||
genTypeInfo(p.module, seqtype, dest.lode.info), length]
|
||||
var call: TLoc
|
||||
initLoc(call, locExpr, dest.lode, OnHeap)
|
||||
if dest.storage == OnHeap and usesNativeGC():
|
||||
@@ -1166,7 +1177,7 @@ proc genNewSeqOfCap(p: BProc; e: PNode; d: var TLoc) =
|
||||
putIntoDest(p, d, e, ropecg(p.module,
|
||||
"($1)#nimNewSeqOfCap($2, $3)", [
|
||||
getTypeDesc(p.module, seqtype),
|
||||
genTypeInfo(p.module, seqtype), a.rdLoc]))
|
||||
genTypeInfo(p.module, seqtype, e.info), a.rdLoc]))
|
||||
gcUsage(e)
|
||||
|
||||
proc genConstExpr(p: BProc, n: PNode): Rope
|
||||
@@ -1205,7 +1216,7 @@ proc genObjConstr(p: BProc, e: PNode, d: var TLoc) =
|
||||
constructLoc(p, tmp)
|
||||
discard getTypeDesc(p.module, t)
|
||||
let ty = getUniqueType(t)
|
||||
for i in 1 .. <e.len:
|
||||
for i in 1 ..< e.len:
|
||||
let it = e.sons[i]
|
||||
var tmp2: TLoc
|
||||
tmp2.r = r
|
||||
@@ -1213,7 +1224,7 @@ proc genObjConstr(p: BProc, e: PNode, d: var TLoc) =
|
||||
if field.loc.r == nil: fillObjectFields(p.module, ty)
|
||||
if field.loc.r == nil: internalError(e.info, "genObjConstr")
|
||||
if it.len == 3 and optFieldCheck in p.options:
|
||||
genFieldCheck(p, it.sons[2], r, field, ty)
|
||||
genFieldCheck(p, it.sons[2], r, field)
|
||||
add(tmp2.r, ".")
|
||||
add(tmp2.r, field.loc.r)
|
||||
tmp2.k = locTemp
|
||||
@@ -1226,18 +1237,32 @@ proc genObjConstr(p: BProc, e: PNode, d: var TLoc) =
|
||||
else:
|
||||
genAssignment(p, d, tmp, {})
|
||||
|
||||
proc lhsDoesAlias(a, b: PNode): bool =
|
||||
for y in b:
|
||||
if isPartOf(a, y) != arNo: return true
|
||||
|
||||
proc genSeqConstr(p: BProc, n: PNode, d: var TLoc) =
|
||||
var arr: TLoc
|
||||
if d.k == locNone:
|
||||
var arr, tmp: TLoc
|
||||
# bug #668
|
||||
let doesAlias = lhsDoesAlias(d.lode, n)
|
||||
let dest = if doesAlias: addr(tmp) else: addr(d)
|
||||
if doesAlias:
|
||||
getTemp(p, n.typ, tmp)
|
||||
elif d.k == locNone:
|
||||
getTemp(p, n.typ, d)
|
||||
# generate call to newSeq before adding the elements per hand:
|
||||
genNewSeqAux(p, d, intLiteral(sonsLen(n)))
|
||||
genNewSeqAux(p, dest[], intLiteral(sonsLen(n)))
|
||||
for i in countup(0, sonsLen(n) - 1):
|
||||
initLoc(arr, locExpr, n[i], OnHeap)
|
||||
arr.r = rfmt(nil, "$1->data[$2]", rdLoc(d), intLiteral(i))
|
||||
arr.r = rfmt(nil, "$1->data[$2]", rdLoc(dest[]), intLiteral(i))
|
||||
arr.storage = OnHeap # we know that sequences are on the heap
|
||||
expr(p, n[i], arr)
|
||||
gcUsage(n)
|
||||
if doesAlias:
|
||||
if d.k == locNone:
|
||||
d = tmp
|
||||
else:
|
||||
genAssignment(p, d, tmp, {})
|
||||
|
||||
proc genArrToSeq(p: BProc, n: PNode, d: var TLoc) =
|
||||
var elem, a, arr: TLoc
|
||||
@@ -1248,17 +1273,31 @@ proc genArrToSeq(p: BProc, n: PNode, d: var TLoc) =
|
||||
if d.k == locNone:
|
||||
getTemp(p, n.typ, d)
|
||||
# generate call to newSeq before adding the elements per hand:
|
||||
var L = int(lengthOrd(n.sons[1].typ))
|
||||
|
||||
let L = int(lengthOrd(n.sons[1].typ))
|
||||
genNewSeqAux(p, d, intLiteral(L))
|
||||
initLocExpr(p, n.sons[1], a)
|
||||
for i in countup(0, L - 1):
|
||||
# bug #5007; do not produce excessive C source code:
|
||||
if L < 10:
|
||||
for i in countup(0, L - 1):
|
||||
initLoc(elem, locExpr, lodeTyp elemType(skipTypes(n.typ, abstractInst)), OnHeap)
|
||||
elem.r = rfmt(nil, "$1->data[$2]", rdLoc(d), intLiteral(i))
|
||||
elem.storage = OnHeap # we know that sequences are on the heap
|
||||
initLoc(arr, locExpr, lodeTyp elemType(skipTypes(n.sons[1].typ, abstractInst)), a.storage)
|
||||
arr.r = rfmt(nil, "$1[$2]", rdLoc(a), intLiteral(i))
|
||||
genAssignment(p, elem, arr, {afDestIsNil, needToCopy})
|
||||
else:
|
||||
var i: TLoc
|
||||
getTemp(p, getSysType(tyInt), i)
|
||||
let oldCode = p.s(cpsStmts)
|
||||
linefmt(p, cpsStmts, "for ($1 = 0; $1 < $2; $1++) {$n", i.r, L.rope)
|
||||
initLoc(elem, locExpr, lodeTyp elemType(skipTypes(n.typ, abstractInst)), OnHeap)
|
||||
elem.r = rfmt(nil, "$1->data[$2]", rdLoc(d), intLiteral(i))
|
||||
elem.r = rfmt(nil, "$1->data[$2]", rdLoc(d), rdLoc(i))
|
||||
elem.storage = OnHeap # we know that sequences are on the heap
|
||||
initLoc(arr, locExpr, lodeTyp elemType(skipTypes(n.sons[1].typ, abstractInst)), a.storage)
|
||||
arr.r = rfmt(nil, "$1[$2]", rdLoc(a), intLiteral(i))
|
||||
arr.r = rfmt(nil, "$1[$2]", rdLoc(a), rdLoc(i))
|
||||
genAssignment(p, elem, arr, {afDestIsNil, needToCopy})
|
||||
lineF(p, cpsStmts, "}$n", [])
|
||||
|
||||
|
||||
proc genNewFinalize(p: BProc, e: PNode) =
|
||||
var
|
||||
@@ -1269,7 +1308,7 @@ proc genNewFinalize(p: BProc, e: PNode) =
|
||||
initLocExpr(p, e.sons[1], a)
|
||||
initLocExpr(p, e.sons[2], f)
|
||||
initLoc(b, locExpr, a.lode, OnHeap)
|
||||
ti = genTypeInfo(p.module, refType)
|
||||
ti = genTypeInfo(p.module, refType, e.info)
|
||||
addf(p.module.s[cfsTypeInit3], "$1->finalizer = (void*)$2;$n", [ti, rdLoc(f)])
|
||||
b.r = ropecg(p.module, "($1) #newObj($2, sizeof($3))", [
|
||||
getTypeDesc(p.module, refType),
|
||||
@@ -1279,10 +1318,10 @@ proc genNewFinalize(p: BProc, e: PNode) =
|
||||
genObjectInit(p, cpsStmts, bt, a, false)
|
||||
gcUsage(e)
|
||||
|
||||
proc genOfHelper(p: BProc; dest: PType; a: Rope): Rope =
|
||||
proc genOfHelper(p: BProc; dest: PType; a: Rope; info: TLineInfo): Rope =
|
||||
# unfortunately 'genTypeInfo' sets tfObjHasKids as a side effect, so we
|
||||
# have to call it here first:
|
||||
let ti = genTypeInfo(p.module, dest)
|
||||
let ti = genTypeInfo(p.module, dest, info)
|
||||
if tfFinal in dest.flags or (objHasKidsValid in p.module.flags and
|
||||
tfObjHasKids notin dest.flags):
|
||||
result = "$1.m_type == $2" % [a, ti]
|
||||
@@ -1295,7 +1334,7 @@ proc genOfHelper(p: BProc; dest: PType; a: Rope): Rope =
|
||||
when false:
|
||||
# former version:
|
||||
result = rfmt(p.module, "#isObj($1.m_type, $2)",
|
||||
a, genTypeInfo(p.module, dest))
|
||||
a, genTypeInfo(p.module, dest, info))
|
||||
|
||||
proc genOf(p: BProc, x: PNode, typ: PType, d: var TLoc) =
|
||||
var a: TLoc
|
||||
@@ -1317,9 +1356,9 @@ proc genOf(p: BProc, x: PNode, typ: PType, d: var TLoc) =
|
||||
globalError(x.info, errGenerated,
|
||||
"no 'of' operator available for pure objects")
|
||||
if nilCheck != nil:
|
||||
r = rfmt(p.module, "(($1) && ($2))", nilCheck, genOfHelper(p, dest, r))
|
||||
r = rfmt(p.module, "(($1) && ($2))", nilCheck, genOfHelper(p, dest, r, x.info))
|
||||
else:
|
||||
r = rfmt(p.module, "($1)", genOfHelper(p, dest, r))
|
||||
r = rfmt(p.module, "($1)", genOfHelper(p, dest, r, x.info))
|
||||
putIntoDest(p, d, x, r, a.storage)
|
||||
|
||||
proc genOf(p: BProc, n: PNode, d: var TLoc) =
|
||||
@@ -1342,12 +1381,12 @@ proc genRepr(p: BProc, e: PNode, d: var TLoc) =
|
||||
of tyEnum, tyOrdinal:
|
||||
putIntoDest(p, d, e,
|
||||
ropecg(p.module, "#reprEnum((NI)$1, $2)", [
|
||||
rdLoc(a), genTypeInfo(p.module, t)]), a.storage)
|
||||
rdLoc(a), genTypeInfo(p.module, t, e.info)]), a.storage)
|
||||
of tyString:
|
||||
putIntoDest(p, d, e, ropecg(p.module, "#reprStr($1)", [rdLoc(a)]), a.storage)
|
||||
of tySet:
|
||||
putIntoDest(p, d, e, ropecg(p.module, "#reprSet($1, $2)", [
|
||||
addrLoc(a), genTypeInfo(p.module, t)]), a.storage)
|
||||
addrLoc(a), genTypeInfo(p.module, t, e.info)]), a.storage)
|
||||
of tyOpenArray, tyVarargs:
|
||||
var b: TLoc
|
||||
case a.t.kind
|
||||
@@ -1362,22 +1401,22 @@ proc genRepr(p: BProc, e: PNode, d: var TLoc) =
|
||||
else: internalError(e.sons[0].info, "genRepr()")
|
||||
putIntoDest(p, d, e,
|
||||
ropecg(p.module, "#reprOpenArray($1, $2)", [rdLoc(b),
|
||||
genTypeInfo(p.module, elemType(t))]), a.storage)
|
||||
genTypeInfo(p.module, elemType(t), e.info)]), a.storage)
|
||||
of tyCString, tyArray, tyRef, tyPtr, tyPointer, tyNil, tySequence:
|
||||
putIntoDest(p, d, e,
|
||||
ropecg(p.module, "#reprAny($1, $2)", [
|
||||
rdLoc(a), genTypeInfo(p.module, t)]), a.storage)
|
||||
rdLoc(a), genTypeInfo(p.module, t, e.info)]), a.storage)
|
||||
of tyEmpty, tyVoid:
|
||||
localError(e.info, "'repr' doesn't support 'void' type")
|
||||
else:
|
||||
putIntoDest(p, d, e, ropecg(p.module, "#reprAny($1, $2)",
|
||||
[addrLoc(a), genTypeInfo(p.module, t)]),
|
||||
[addrLoc(a), genTypeInfo(p.module, t, e.info)]),
|
||||
a.storage)
|
||||
gcUsage(e)
|
||||
|
||||
proc genGetTypeInfo(p: BProc, e: PNode, d: var TLoc) =
|
||||
let t = e.sons[1].typ
|
||||
putIntoDest(p, d, e, genTypeInfo(p.module, t))
|
||||
putIntoDest(p, d, e, genTypeInfo(p.module, t, e.info))
|
||||
|
||||
proc genDollar(p: BProc, n: PNode, d: var TLoc, frmt: string) =
|
||||
var a: TLoc
|
||||
@@ -1618,8 +1657,14 @@ proc genSomeCast(p: BProc, e: PNode, d: var TLoc) =
|
||||
putIntoDest(p, d, e, "(($1) ($2))" %
|
||||
[getClosureType(p.module, etyp, clHalfWithEnv), rdCharLoc(a)], a.storage)
|
||||
else:
|
||||
putIntoDest(p, d, e, "(($1) ($2))" %
|
||||
[getTypeDesc(p.module, e.typ), rdCharLoc(a)], a.storage)
|
||||
let srcTyp = skipTypes(e.sons[1].typ, abstractRange)
|
||||
# C++ does not like direct casts from pointer to shorter integral types
|
||||
if srcTyp.kind in {tyPtr, tyPointer} and etyp.kind in IntegralTypes:
|
||||
putIntoDest(p, d, e, "(($1) (ptrdiff_t) ($2))" %
|
||||
[getTypeDesc(p.module, e.typ), rdCharLoc(a)], a.storage)
|
||||
else:
|
||||
putIntoDest(p, d, e, "(($1) ($2))" %
|
||||
[getTypeDesc(p.module, e.typ), rdCharLoc(a)], a.storage)
|
||||
|
||||
proc genCast(p: BProc, e: PNode, d: var TLoc) =
|
||||
const ValueTypes = {tyFloat..tyFloat128, tyTuple, tyObject, tyArray}
|
||||
@@ -1663,7 +1708,7 @@ proc genRangeChck(p: BProc, n: PNode, d: var TLoc, magic: string) =
|
||||
|
||||
proc genConv(p: BProc, e: PNode, d: var TLoc) =
|
||||
let destType = e.typ.skipTypes({tyVar, tyGenericInst, tyAlias})
|
||||
if compareTypes(destType, e.sons[1].typ, dcEqIgnoreDistinct):
|
||||
if sameBackendType(destType, e.sons[1].typ):
|
||||
expr(p, e.sons[1], d)
|
||||
else:
|
||||
genSomeCast(p, e, d)
|
||||
@@ -1830,7 +1875,10 @@ proc genMagicExpr(p: BProc, e: PNode, d: var TLoc, op: TMagic) =
|
||||
initLocExpr(p, e.sons[2], b)
|
||||
genDeepCopy(p, a, b)
|
||||
of mDotDot, mEqCString: genCall(p, e, d)
|
||||
else: internalError(e.info, "genMagicExpr: " & $op)
|
||||
else:
|
||||
when defined(debugMagics):
|
||||
echo p.prc.name.s, " ", p.prc.id, " ", p.prc.flags, " ", p.prc.ast[genericParamsPos].kind
|
||||
internalError(e.info, "genMagicExpr: " & $op)
|
||||
|
||||
proc genSetConstr(p: BProc, e: PNode, d: var TLoc) =
|
||||
# example: { a..b, c, d, e, f..g }
|
||||
@@ -1935,10 +1983,35 @@ proc genComplexConst(p: BProc, sym: PSym, d: var TLoc) =
|
||||
assert((sym.loc.r != nil) and (sym.loc.t != nil))
|
||||
putLocIntoDest(p, d, sym.loc)
|
||||
|
||||
template genStmtListExprImpl(exprOrStmt) {.dirty.} =
|
||||
#let hasNimFrame = magicsys.getCompilerProc("nimFrame") != nil
|
||||
let hasNimFrame = p.prc != nil and
|
||||
sfSystemModule notin p.module.module.flags and
|
||||
optStackTrace in p.prc.options
|
||||
var frameName: Rope = nil
|
||||
for i in 0 .. n.len - 2:
|
||||
let it = n[i]
|
||||
if it.kind == nkComesFrom:
|
||||
if hasNimFrame and frameName == nil:
|
||||
inc p.labels
|
||||
frameName = "FR" & rope(p.labels) & "_"
|
||||
let theMacro = it[0].sym
|
||||
add p.s(cpsStmts), initFrameNoDebug(p, frameName,
|
||||
makeCString theMacro.name.s,
|
||||
theMacro.info.quotedFilename, it.info.line)
|
||||
else:
|
||||
genStmts(p, it)
|
||||
if n.len > 0: exprOrStmt
|
||||
if frameName != nil:
|
||||
add p.s(cpsStmts), deinitFrameNoDebug(p, frameName)
|
||||
|
||||
proc genStmtListExpr(p: BProc, n: PNode, d: var TLoc) =
|
||||
var length = sonsLen(n)
|
||||
for i in countup(0, length - 2): genStmts(p, n.sons[i])
|
||||
if length > 0: expr(p, n.sons[length - 1], d)
|
||||
genStmtListExprImpl:
|
||||
expr(p, n[n.len - 1], d)
|
||||
|
||||
proc genStmtList(p: BProc, n: PNode) =
|
||||
genStmtListExprImpl:
|
||||
genStmts(p, n[n.len - 1])
|
||||
|
||||
proc upConv(p: BProc, n: PNode, d: var TLoc) =
|
||||
var a: TLoc
|
||||
@@ -1959,10 +2032,10 @@ proc upConv(p: BProc, n: PNode, d: var TLoc) =
|
||||
t = skipTypes(t.sons[0], skipPtrs)
|
||||
if nilCheck != nil:
|
||||
linefmt(p, cpsStmts, "if ($1) #chckObj($2.m_type, $3);$n",
|
||||
nilCheck, r, genTypeInfo(p.module, dest))
|
||||
nilCheck, r, genTypeInfo(p.module, dest, n.info))
|
||||
else:
|
||||
linefmt(p, cpsStmts, "#chckObj($1.m_type, $2);$n",
|
||||
r, genTypeInfo(p.module, dest))
|
||||
r, genTypeInfo(p.module, dest, n.info))
|
||||
if n.sons[0].typ.kind != tyObject:
|
||||
putIntoDest(p, d, n,
|
||||
"(($1) ($2))" % [getTypeDesc(p.module, n.typ), rdLoc(a)], a.storage)
|
||||
@@ -2137,8 +2210,7 @@ proc expr(p: BProc, n: PNode, d: var TLoc) =
|
||||
of nkCheckedFieldExpr: genCheckedRecordField(p, n, d)
|
||||
of nkBlockExpr, nkBlockStmt: genBlock(p, n, d)
|
||||
of nkStmtListExpr: genStmtListExpr(p, n, d)
|
||||
of nkStmtList:
|
||||
for i in countup(0, sonsLen(n) - 1): genStmts(p, n.sons[i])
|
||||
of nkStmtList: genStmtList(p, n)
|
||||
of nkIfExpr, nkIfStmt: genIf(p, n, d)
|
||||
of nkWhen:
|
||||
# This should be a "when nimvm" node.
|
||||
@@ -2245,10 +2317,16 @@ proc getDefaultValue(p: BProc; typ: PType; info: TLineInfo): Rope =
|
||||
result = rope"{NIM_NIL, NIM_NIL}"
|
||||
of tyObject:
|
||||
if not isObjLackingTypeField(t) and not p.module.compileToCpp:
|
||||
result = "{{$1}}" % [genTypeInfo(p.module, t)]
|
||||
result = "{{$1}}" % [genTypeInfo(p.module, t, info)]
|
||||
else:
|
||||
result = rope"{}"
|
||||
of tyArray, tyTuple: result = rope"{}"
|
||||
of tyTuple:
|
||||
result = rope"{"
|
||||
for i in 0 ..< typ.len:
|
||||
if i > 0: result.add ", "
|
||||
result.add getDefaultValue(p, typ.sons[i], info)
|
||||
result.add "}"
|
||||
of tyArray: result = rope"{}"
|
||||
of tySet:
|
||||
if mapType(t) == ctArray: result = rope"{}"
|
||||
else: result = rope"0"
|
||||
@@ -2290,7 +2368,7 @@ proc getNullValueAuxT(p: BProc; orig, t: PType; obj, cons: PNode, result: var Ro
|
||||
base = skipTypes(base, skipPtrs)
|
||||
getNullValueAuxT(p, orig, base, base.n, cons, result, count)
|
||||
elif not isObjLackingTypeField(t) and not p.module.compileToCpp:
|
||||
addf(result, "$1", [genTypeInfo(p.module, orig)])
|
||||
addf(result, "$1", [genTypeInfo(p.module, orig, obj.info)])
|
||||
inc count
|
||||
getNullValueAux(p, t, obj, cons, result, count)
|
||||
# do not emit '{}' as that is not valid C:
|
||||
|
||||
@@ -96,7 +96,7 @@ proc writeIntSet(a: IntSet, s: var string) =
|
||||
s.add('}')
|
||||
|
||||
proc genMergeInfo*(m: BModule): Rope =
|
||||
if optSymbolFiles notin gGlobalOptions: return nil
|
||||
if not compilationCachePresent: return nil
|
||||
var s = "/*\tNIM_merge_INFO:"
|
||||
s.add(tnl)
|
||||
s.add("typeCache:{")
|
||||
|
||||
@@ -20,7 +20,7 @@ proc registerGcRoot(p: BProc, v: PSym) =
|
||||
containsGarbageCollectedRef(v.loc.t):
|
||||
# we register a specialized marked proc here; this has the advantage
|
||||
# that it works out of the box for thread local storage then :-)
|
||||
let prc = genTraverseProcForGlobal(p.module, v)
|
||||
let prc = genTraverseProcForGlobal(p.module, v, v.info)
|
||||
appcg(p.module, p.module.initProc.procSec(cpsInit),
|
||||
"#nimRegisterGlobalMarker($1);$n", [prc])
|
||||
|
||||
@@ -165,7 +165,7 @@ proc genGotoState(p: BProc, n: PNode) =
|
||||
statesCounter = n[1].intVal
|
||||
let prefix = if n.len == 3 and n[2].kind == nkStrLit: n[2].strVal.rope
|
||||
else: rope"STATE"
|
||||
for i in 0 .. statesCounter:
|
||||
for i in 0i64 .. statesCounter:
|
||||
lineF(p, cpsStmts, "case $2: goto $1$2;$n", [prefix, rope(i)])
|
||||
lineF(p, cpsStmts, "}$n", [])
|
||||
|
||||
@@ -235,7 +235,7 @@ proc genSingleVar(p: BProc, a: PNode) =
|
||||
var params: Rope
|
||||
let typ = skipTypes(value.sons[0].typ, abstractInst)
|
||||
assert(typ.kind == tyProc)
|
||||
for i in 1.. <value.len:
|
||||
for i in 1..<value.len:
|
||||
if params != nil: params.add(~", ")
|
||||
assert(sonsLen(typ) == sonsLen(typ.n))
|
||||
add(params, genOtherArg(p, value, i, typ))
|
||||
@@ -386,7 +386,7 @@ proc genReturnStmt(p: BProc, t: PNode) =
|
||||
lineF(p, cpsStmts, "goto BeforeRet_;$n", [])
|
||||
|
||||
proc genGotoForCase(p: BProc; caseStmt: PNode) =
|
||||
for i in 1 .. <caseStmt.len:
|
||||
for i in 1 ..< caseStmt.len:
|
||||
startBlock(p)
|
||||
let it = caseStmt.sons[i]
|
||||
for j in 0 .. it.len-2:
|
||||
@@ -402,7 +402,7 @@ proc genComputedGoto(p: BProc; n: PNode) =
|
||||
# first pass: Generate array of computed labels:
|
||||
var casePos = -1
|
||||
var arraySize: int
|
||||
for i in 0 .. <n.len:
|
||||
for i in 0 ..< n.len:
|
||||
let it = n.sons[i]
|
||||
if it.kind == nkCaseStmt:
|
||||
if lastSon(it).kind != nkOfBranch:
|
||||
@@ -432,7 +432,7 @@ proc genComputedGoto(p: BProc; n: PNode) =
|
||||
let oldBody = p.blocks[topBlock].sections[cpsStmts]
|
||||
p.blocks[topBlock].sections[cpsStmts] = nil
|
||||
|
||||
for j in casePos+1 .. <n.len: genStmts(p, n.sons[j])
|
||||
for j in casePos+1 ..< n.len: genStmts(p, n.sons[j])
|
||||
let tailB = p.blocks[topBlock].sections[cpsStmts]
|
||||
|
||||
p.blocks[topBlock].sections[cpsStmts] = nil
|
||||
@@ -447,7 +447,7 @@ proc genComputedGoto(p: BProc; n: PNode) =
|
||||
# first goto:
|
||||
lineF(p, cpsStmts, "goto *$#[$#];$n", [tmp, a.rdLoc])
|
||||
|
||||
for i in 1 .. <caseStmt.len:
|
||||
for i in 1 ..< caseStmt.len:
|
||||
startBlock(p)
|
||||
let it = caseStmt.sons[i]
|
||||
for j in 0 .. it.len-2:
|
||||
@@ -457,7 +457,7 @@ proc genComputedGoto(p: BProc; n: PNode) =
|
||||
let val = getOrdValue(it.sons[j])
|
||||
lineF(p, cpsStmts, "TMP$#_:$n", [intLiteral(val+id+1)])
|
||||
genStmts(p, it.lastSon)
|
||||
#for j in casePos+1 .. <n.len: genStmts(p, n.sons[j]) # tailB
|
||||
#for j in casePos+1 ..< n.len: genStmts(p, n.sons[j]) # tailB
|
||||
#for j in 0 .. casePos-1: genStmts(p, n.sons[j]) # tailA
|
||||
add(p.s(cpsStmts), tailB)
|
||||
add(p.s(cpsStmts), tailA)
|
||||
@@ -564,9 +564,6 @@ proc genBreakStmt(p: BProc, t: PNode) =
|
||||
genLineDir(p, t)
|
||||
lineF(p, cpsStmts, "goto $1;$n", [label])
|
||||
|
||||
proc getRaiseFrmt(p: BProc): string =
|
||||
result = "#raiseException((#Exception*)$1, $2);$n"
|
||||
|
||||
proc genRaiseStmt(p: BProc, t: PNode) =
|
||||
if p.inExceptBlock > 0:
|
||||
# if the current try stmt have a finally block,
|
||||
@@ -580,7 +577,8 @@ proc genRaiseStmt(p: BProc, t: PNode) =
|
||||
var e = rdLoc(a)
|
||||
var typ = skipTypes(t.sons[0].typ, abstractPtrs)
|
||||
genLineDir(p, t)
|
||||
lineCg(p, cpsStmts, getRaiseFrmt(p), [e, makeCString(typ.sym.name.s)])
|
||||
lineCg(p, cpsStmts, "#raiseException((#Exception*)$1, $2);$n",
|
||||
[e, makeCString(typ.sym.name.s)])
|
||||
else:
|
||||
genLineDir(p, t)
|
||||
# reraise the last exception:
|
||||
@@ -744,7 +742,7 @@ proc genOrdinalCase(p: BProc, n: PNode, d: var TLoc) =
|
||||
if splitPoint+1 < n.len:
|
||||
lineF(p, cpsStmts, "switch ($1) {$n", [rdCharLoc(a)])
|
||||
var hasDefault = false
|
||||
for i in splitPoint+1 .. < n.len:
|
||||
for i in splitPoint+1 ..< n.len:
|
||||
# bug #4230: avoid false sharing between branches:
|
||||
if d.k == locTemp and isEmptyType(n.typ): d.k = locNone
|
||||
var branch = n[i]
|
||||
@@ -835,7 +833,7 @@ proc genTryCpp(p: BProc, t: PNode, d: var TLoc) =
|
||||
if orExpr != nil: add(orExpr, "||")
|
||||
appcg(p.module, orExpr,
|
||||
"#isObj($1.exp->m_type, $2)",
|
||||
[exc, genTypeInfo(p.module, t.sons[i].sons[j].typ)])
|
||||
[exc, genTypeInfo(p.module, t[i][j].typ, t[i][j].info)])
|
||||
lineF(p, cpsStmts, "if ($1) ", [orExpr])
|
||||
startBlock(p)
|
||||
expr(p, t.sons[i].sons[blen-1], d)
|
||||
@@ -944,7 +942,7 @@ proc genTry(p: BProc, t: PNode, d: var TLoc) =
|
||||
"#isObj(#getCurrentException()->Sup.m_type, $1)"
|
||||
else: "#isObj(#getCurrentException()->m_type, $1)"
|
||||
appcg(p.module, orExpr, isObjFormat,
|
||||
[genTypeInfo(p.module, t.sons[i].sons[j].typ)])
|
||||
[genTypeInfo(p.module, t[i][j].typ, t[i][j].info)])
|
||||
if i > 1: line(p, cpsStmts, "else ")
|
||||
startBlock(p, "if ($1) {$n", [orExpr])
|
||||
linefmt(p, cpsStmts, "$1.status = 0;$n", safePoint)
|
||||
@@ -1062,7 +1060,7 @@ proc genWatchpoint(p: BProc, n: PNode) =
|
||||
let typ = skipTypes(n.sons[1].typ, abstractVarRange)
|
||||
lineCg(p, cpsStmts, "#dbgRegisterWatchpoint($1, (NCSTRING)$2, $3);$n",
|
||||
[a.addrLoc, makeCString(renderTree(n.sons[1])),
|
||||
genTypeInfo(p.module, typ)])
|
||||
genTypeInfo(p.module, typ, n.info)])
|
||||
|
||||
proc genPragma(p: BProc, n: PNode) =
|
||||
for i in countup(0, sonsLen(n) - 1):
|
||||
@@ -1092,7 +1090,7 @@ proc genDiscriminantCheck(p: BProc, a, tmp: TLoc, objtype: PType,
|
||||
field: PSym) =
|
||||
var t = skipTypes(objtype, abstractVar)
|
||||
assert t.kind == tyObject
|
||||
discard genTypeInfo(p.module, t)
|
||||
discard genTypeInfo(p.module, t, a.lode.info)
|
||||
var L = lengthOrd(field.typ)
|
||||
if not containsOrIncl(p.module.declaredThings, field.id):
|
||||
appcg(p.module, cfsVars, "extern $1",
|
||||
@@ -1112,19 +1110,46 @@ proc asgnFieldDiscriminant(p: BProc, e: PNode) =
|
||||
genDiscriminantCheck(p, a, tmp, dotExpr.sons[0].typ, dotExpr.sons[1].sym)
|
||||
genAssignment(p, a, tmp, {})
|
||||
|
||||
proc patchAsgnStmtListExpr(father, orig, n: PNode) =
|
||||
case n.kind
|
||||
of nkDerefExpr, nkHiddenDeref:
|
||||
let asgn = copyNode(orig)
|
||||
asgn.add orig[0]
|
||||
asgn.add n
|
||||
father.add asgn
|
||||
of nkStmtList, nkStmtListExpr:
|
||||
for x in n:
|
||||
patchAsgnStmtListExpr(father, orig, x)
|
||||
else:
|
||||
father.add n
|
||||
|
||||
proc genAsgn(p: BProc, e: PNode, fastAsgn: bool) =
|
||||
if e.sons[0].kind == nkSym and sfGoto in e.sons[0].sym.flags:
|
||||
genLineDir(p, e)
|
||||
genGotoVar(p, e.sons[1])
|
||||
elif not fieldDiscriminantCheckNeeded(p, e):
|
||||
# this fixes bug #6422 but we really need to change the representation of
|
||||
# arrays in the backend...
|
||||
let le = e[0]
|
||||
let ri = e[1]
|
||||
var needsRepair = false
|
||||
var it = ri
|
||||
while it.kind in {nkStmtList, nkStmtListExpr}:
|
||||
it = it.lastSon
|
||||
needsRepair = true
|
||||
if it.kind in {nkDerefExpr, nkHiddenDeref} and needsRepair:
|
||||
var patchedTree = newNodeI(nkStmtList, e.info)
|
||||
patchAsgnStmtListExpr(patchedTree, e, ri)
|
||||
genStmts(p, patchedTree)
|
||||
return
|
||||
|
||||
var a: TLoc
|
||||
if e[0].kind in {nkDerefExpr, nkHiddenDeref}:
|
||||
genDeref(p, e[0], a, enforceDeref=true)
|
||||
if le.kind in {nkDerefExpr, nkHiddenDeref}:
|
||||
genDeref(p, le, a, enforceDeref=true)
|
||||
else:
|
||||
initLocExpr(p, e.sons[0], a)
|
||||
initLocExpr(p, le, a)
|
||||
if fastAsgn: incl(a.flags, lfNoDeepCopy)
|
||||
assert(a.t != nil)
|
||||
let ri = e.sons[1]
|
||||
genLineDir(p, ri)
|
||||
loadInto(p, e.sons[0], ri, a)
|
||||
else:
|
||||
|
||||
@@ -66,7 +66,7 @@ proc genTraverseProc(c: var TTraversalClosure, accessor: Rope, typ: PType) =
|
||||
|
||||
var p = c.p
|
||||
case typ.kind
|
||||
of tyGenericInst, tyGenericBody, tyTypeDesc, tyAlias, tyDistinct:
|
||||
of tyGenericInst, tyGenericBody, tyTypeDesc, tyAlias, tyDistinct, tyInferred:
|
||||
genTraverseProc(c, accessor, lastSon(typ))
|
||||
of tyArray:
|
||||
let arraySize = lengthOrd(typ.sons[0])
|
||||
@@ -151,8 +151,8 @@ proc genTraverseProc(m: BModule, origTyp: PType; sig: SigHash;
|
||||
m.s[cfsProcHeaders].addf("$1;$n", [header])
|
||||
m.s[cfsProcs].add(generatedProc)
|
||||
|
||||
proc genTraverseProcForGlobal(m: BModule, s: PSym): Rope =
|
||||
discard genTypeInfo(m, s.loc.t)
|
||||
proc genTraverseProcForGlobal(m: BModule, s: PSym; info: TLineInfo): Rope =
|
||||
discard genTypeInfo(m, s.loc.t, info)
|
||||
|
||||
var c: TTraversalClosure
|
||||
var p = newProc(nil, m)
|
||||
|
||||
@@ -119,7 +119,7 @@ proc scopeMangledParam(p: BProc; param: PSym) =
|
||||
|
||||
const
|
||||
irrelevantForBackend = {tyGenericBody, tyGenericInst, tyGenericInvocation,
|
||||
tyDistinct, tyRange, tyStatic, tyAlias}
|
||||
tyDistinct, tyRange, tyStatic, tyAlias, tyInferred}
|
||||
|
||||
proc typeName(typ: PType): Rope =
|
||||
let typ = typ.skipTypes(irrelevantForBackend)
|
||||
@@ -278,7 +278,10 @@ proc ccgIntroducedPtr(s: PSym): bool =
|
||||
elif tfByCopy in pt.flags: return false
|
||||
case pt.kind
|
||||
of tyObject:
|
||||
if (optByRef in s.options) or (getSize(pt) > platform.floatSize * 3):
|
||||
if s.typ.sym != nil and sfForward in s.typ.sym.flags:
|
||||
# forwarded objects are *always* passed by pointers for consistency!
|
||||
result = true
|
||||
elif (optByRef in s.options) or (getSize(pt) > platform.floatSize * 3):
|
||||
result = true # requested anyway
|
||||
elif (tfFinal in pt.flags) and (pt.sons[0] == nil):
|
||||
result = false # no need, because no subtyping possible
|
||||
@@ -806,7 +809,7 @@ proc getTypeDescAux(m: BModule, origTyp: PType, check: var IntSet): Rope =
|
||||
var chunkStart = 0
|
||||
while i < cppName.data.len:
|
||||
if cppName.data[i] == '\'':
|
||||
var chunkEnd = <i
|
||||
var chunkEnd = i-1
|
||||
var idx, stars: int
|
||||
if scanCppGenericSlot(cppName.data, i, idx, stars):
|
||||
result.add cppName.data.substr(chunkStart, chunkEnd)
|
||||
@@ -854,11 +857,12 @@ proc getTypeDescAux(m: BModule, origTyp: PType, check: var IntSet): Rope =
|
||||
[structOrUnion(t), result])
|
||||
assert m.forwTypeCache[sig] == result
|
||||
m.typeCache[sig] = result # always call for sideeffects:
|
||||
let recdesc = if t.kind != tyTuple: getRecordDesc(m, t, result, check)
|
||||
else: getTupleDesc(m, t, result, check)
|
||||
if not isImportedType(t):
|
||||
add(m.s[cfsTypes], recdesc)
|
||||
elif tfIncompleteStruct notin t.flags: addAbiCheck(m, t, result)
|
||||
if not incompleteType(t):
|
||||
let recdesc = if t.kind != tyTuple: getRecordDesc(m, t, result, check)
|
||||
else: getTupleDesc(m, t, result, check)
|
||||
if not isImportedType(t):
|
||||
add(m.s[cfsTypes], recdesc)
|
||||
elif tfIncompleteStruct notin t.flags: addAbiCheck(m, t, result)
|
||||
of tySet:
|
||||
result = $t.kind & '_' & getTypeName(m, t.lastSon, hashType t.lastSon)
|
||||
m.typeCache[sig] = result
|
||||
@@ -921,6 +925,8 @@ proc genProcHeader(m: BModule, prc: PSym): Rope =
|
||||
result.add "N_LIB_EXPORT "
|
||||
elif prc.typ.callConv == ccInline:
|
||||
result.add "static "
|
||||
elif {sfImportc, sfExportc} * prc.flags == {}:
|
||||
result.add "N_LIB_PRIVATE "
|
||||
var check = initIntSet()
|
||||
fillLoc(prc.loc, locProc, prc.ast[namePos], mangleName(m, prc), OnUnknown)
|
||||
genProcParams(m, prc.typ, rettype, params, check)
|
||||
@@ -935,12 +941,13 @@ proc genProcHeader(m: BModule, prc: PSym): Rope =
|
||||
|
||||
# ------------------ type info generation -------------------------------------
|
||||
|
||||
proc genTypeInfo(m: BModule, t: PType): Rope
|
||||
proc genTypeInfo(m: BModule, t: PType; info: TLineInfo): Rope
|
||||
proc getNimNode(m: BModule): Rope =
|
||||
result = "$1[$2]" % [m.typeNodesName, rope(m.typeNodes)]
|
||||
inc(m.typeNodes)
|
||||
|
||||
proc genTypeInfoAuxBase(m: BModule; typ, origType: PType; name, base: Rope) =
|
||||
proc genTypeInfoAuxBase(m: BModule; typ, origType: PType;
|
||||
name, base: Rope; info: TLineInfo) =
|
||||
var nimtypeKind: int
|
||||
#allocMemTI(m, typ, name)
|
||||
if isObjLackingTypeField(typ):
|
||||
@@ -963,22 +970,29 @@ proc genTypeInfoAuxBase(m: BModule; typ, origType: PType; name, base: Rope) =
addf(m.s[cfsTypeInit3], "$1.flags = $2;$n", [name, rope(flags)])
discard cgsym(m, "TNimType")
if isDefined("nimTypeNames"):
var typename = typeToString(origType, preferName)
if typename == "ref object" and origType.skipTypes(skipPtrs).sym != nil:
typename = "anon ref object from " & $origType.skipTypes(skipPtrs).sym.info
addf(m.s[cfsTypeInit3], "$1.name = $2;$n",
[name, makeCstring typeToString(origType, preferName)])
[name, makeCstring typename])
discard cgsym(m, "nimTypeRoot")
addf(m.s[cfsTypeInit3], "$1.nextType = nimTypeRoot; nimTypeRoot=&$1;$n",
[name])
addf(m.s[cfsVars], "TNimType $1;$n", [name])

proc genTypeInfoAux(m: BModule, typ, origType: PType, name: Rope) =
proc genTypeInfoAux(m: BModule, typ, origType: PType, name: Rope;
info: TLineInfo) =
var base: Rope
if sonsLen(typ) > 0 and typ.lastSon != nil:
var x = typ.lastSon
if typ.kind == tyObject: x = x.skipTypes(skipPtrs)
base = genTypeInfo(m, x)
if typ.kind == tyPtr and x.kind == tyObject and incompleteType(x):
base = rope("0")
else:
base = genTypeInfo(m, x, info)
else:
base = rope("0")
genTypeInfoAuxBase(m, typ, origType, name, base)
genTypeInfoAuxBase(m, typ, origType, name, base, info)

proc discriminatorTableName(m: BModule, objtype: PType, d: PSym): Rope =
# bugfix: we need to search the type that contains the discriminator:
@@ -994,19 +1008,20 @@ proc discriminatorTableDecl(m: BModule, objtype: PType, d: PSym): Rope =
var tmp = discriminatorTableName(m, objtype, d)
result = "TNimNode* $1[$2];$n" % [tmp, rope(lengthOrd(d.typ)+1)]

proc genObjectFields(m: BModule, typ, origType: PType, n: PNode, expr: Rope) =
proc genObjectFields(m: BModule, typ, origType: PType, n: PNode, expr: Rope;
info: TLineInfo) =
case n.kind
of nkRecList:
var L = sonsLen(n)
if L == 1:
genObjectFields(m, typ, origType, n.sons[0], expr)
genObjectFields(m, typ, origType, n.sons[0], expr, info)
elif L > 0:
var tmp = getTempName(m)
addf(m.s[cfsTypeInit1], "static TNimNode* $1[$2];$n", [tmp, rope(L)])
for i in countup(0, L-1):
var tmp2 = getNimNode(m)
addf(m.s[cfsTypeInit3], "$1[$2] = &$3;$n", [tmp, rope(i), tmp2])
genObjectFields(m, typ, origType, n.sons[i], tmp2)
genObjectFields(m, typ, origType, n.sons[i], tmp2, info)
addf(m.s[cfsTypeInit3], "$1.len = $2; $1.kind = 2; $1.sons = &$3[0];$n",
[expr, rope(L), tmp])
else:
@@ -1024,14 +1039,14 @@ proc genObjectFields(m: BModule, typ, origType: PType, n: PNode, expr: Rope) =
"$1.offset = offsetof($2, $3);$n" & "$1.typ = $4;$n" &
"$1.name = $5;$n" & "$1.sons = &$6[0];$n" &
"$1.len = $7;$n", [expr, getTypeDesc(m, origType), field.loc.r,
genTypeInfo(m, field.typ),
genTypeInfo(m, field.typ, info),
makeCString(field.name.s),
tmp, rope(L)])
addf(m.s[cfsData], "TNimNode* $1[$2];$n", [tmp, rope(L+1)])
for i in countup(1, sonsLen(n)-1):
var b = n.sons[i] # branch
var tmp2 = getNimNode(m)
genObjectFields(m, typ, origType, lastSon(b), tmp2)
genObjectFields(m, typ, origType, lastSon(b), tmp2, info)
case b.kind
of nkOfBranch:
if sonsLen(b) < 2:
@@ -1059,15 +1074,20 @@ proc genObjectFields(m: BModule, typ, origType: PType, n: PNode, expr: Rope) =
addf(m.s[cfsTypeInit3], "$1.kind = 1;$n" &
"$1.offset = offsetof($2, $3);$n" & "$1.typ = $4;$n" &
"$1.name = $5;$n", [expr, getTypeDesc(m, origType),
field.loc.r, genTypeInfo(m, field.typ), makeCString(field.name.s)])
field.loc.r, genTypeInfo(m, field.typ, info), makeCString(field.name.s)])
else: internalError(n.info, "genObjectFields")

proc genObjectInfo(m: BModule, typ, origType: PType, name: Rope) =
if typ.kind == tyObject: genTypeInfoAux(m, typ, origType, name)
else: genTypeInfoAuxBase(m, typ, origType, name, rope("0"))
proc genObjectInfo(m: BModule, typ, origType: PType, name: Rope; info: TLineInfo) =
if typ.kind == tyObject:
if incompleteType(typ):
localError(info, "request for RTTI generation for incomplete object: " &
typeToString(typ))
genTypeInfoAux(m, typ, origType, name, info)
else:
genTypeInfoAuxBase(m, typ, origType, name, rope("0"), info)
var tmp = getNimNode(m)
if not isImportedType(typ):
genObjectFields(m, typ, origType, typ.n, tmp)
genObjectFields(m, typ, origType, typ.n, tmp, info)
addf(m.s[cfsTypeInit3], "$1.node = &$2;$n", [name, tmp])
var t = typ.sons[0]
while t != nil:
@@ -1075,8 +1095,8 @@ proc genObjectInfo(m: BModule, typ, origType: PType, name: Rope) =
t.flags.incl tfObjHasKids
t = t.sons[0]

proc genTupleInfo(m: BModule, typ, origType: PType, name: Rope) =
genTypeInfoAuxBase(m, typ, typ, name, rope("0"))
proc genTupleInfo(m: BModule, typ, origType: PType, name: Rope; info: TLineInfo) =
genTypeInfoAuxBase(m, typ, typ, name, rope("0"), info)
var expr = getNimNode(m)
var length = sonsLen(typ)
if length > 0:
@@ -1090,7 +1110,7 @@ proc genTupleInfo(m: BModule, typ, origType: PType, name: Rope) =
"$1.offset = offsetof($2, Field$3);$n" &
"$1.typ = $4;$n" &
"$1.name = \"Field$3\";$n",
[tmp2, getTypeDesc(m, origType), rope(i), genTypeInfo(m, a)])
[tmp2, getTypeDesc(m, origType), rope(i), genTypeInfo(m, a, info)])
addf(m.s[cfsTypeInit3], "$1.len = $2; $1.kind = 2; $1.sons = &$3[0];$n",
[expr, rope(length), tmp])
else:
@@ -1098,12 +1118,12 @@ proc genTupleInfo(m: BModule, typ, origType: PType, name: Rope) =
[expr, rope(length)])
addf(m.s[cfsTypeInit3], "$1.node = &$2;$n", [name, expr])

proc genEnumInfo(m: BModule, typ: PType, name: Rope) =
proc genEnumInfo(m: BModule, typ: PType, name: Rope; info: TLineInfo) =
# Type information for enumerations is quite heavy, so we do some
# optimizations here: The ``typ`` field is never set, as it is redundant
# anyway. We generate a cstring array and a loop over it. Exceptional
# positions will be reset after the loop.
genTypeInfoAux(m, typ, typ, name)
genTypeInfoAux(m, typ, typ, name, info)
var nodePtrs = getTempName(m)
var length = sonsLen(typ.n)
addf(m.s[cfsTypeInit1], "static TNimNode* $1[$2];$n",
@@ -1141,15 +1161,15 @@ proc genEnumInfo(m: BModule, typ: PType, name: Rope) =
# 1 << 2 is {ntfEnumHole}
addf(m.s[cfsTypeInit3], "$1.flags = 1<<2;$n", [name])

proc genSetInfo(m: BModule, typ: PType, name: Rope) =
proc genSetInfo(m: BModule, typ: PType, name: Rope; info: TLineInfo) =
assert(typ.sons[0] != nil)
genTypeInfoAux(m, typ, typ, name)
genTypeInfoAux(m, typ, typ, name, info)
var tmp = getNimNode(m)
addf(m.s[cfsTypeInit3], "$1.len = $2; $1.kind = 0;$n" & "$3.node = &$1;$n",
[tmp, rope(firstOrd(typ)), name])

proc genArrayInfo(m: BModule, typ: PType, name: Rope) =
genTypeInfoAuxBase(m, typ, typ, name, genTypeInfo(m, typ.sons[1]))
proc genArrayInfo(m: BModule, typ: PType, name: Rope; info: TLineInfo) =
genTypeInfoAuxBase(m, typ, typ, name, genTypeInfo(m, typ.sons[1], info), info)

proc fakeClosureType(owner: PSym): PType =
# we generate the same RTTI as for a tuple[pointer, ref tuple[]]
@@ -1171,11 +1191,11 @@ proc genDeepCopyProc(m: BModule; s: PSym; result: Rope) =
addf(m.s[cfsTypeInit3], "$1.deepcopy =(void* (N_RAW_NIMCALL*)(void*))$2;$n",
[result, s.loc.r])

proc genTypeInfo(m: BModule, t: PType): Rope =
proc genTypeInfo(m: BModule, t: PType; info: TLineInfo): Rope =
let origType = t
var t = skipTypes(origType, irrelevantForBackend + tyUserTypeClasses)
if t.kind == tyOpt:
return genTypeInfo(m, optLowering(t))
return genTypeInfo(m, optLowering(t), info)

let sig = hashType(origType)
result = m.typeInfoMarker.getOrDefault(sig)
@@ -1197,7 +1217,7 @@ proc genTypeInfo(m: BModule, t: PType): Rope =
let owner = t.skipTypes(typedescPtrs).owner.getModule
if owner != m.module:
# make sure the type info is created in the owner module
discard genTypeInfo(m.g.modules[owner.position], origType)
discard genTypeInfo(m.g.modules[owner.position], origType, info)
# reference the type info as extern here
discard cgsym(m, "TNimType")
discard cgsym(m, "TNimNode")
@@ -1208,35 +1228,35 @@ proc genTypeInfo(m: BModule, t: PType): Rope =
case t.kind
of tyEmpty, tyVoid: result = rope"0"
of tyPointer, tyBool, tyChar, tyCString, tyString, tyInt..tyUInt64, tyVar:
genTypeInfoAuxBase(m, t, t, result, rope"0")
genTypeInfoAuxBase(m, t, t, result, rope"0", info)
of tyStatic:
if t.n != nil: result = genTypeInfo(m, lastSon t)
if t.n != nil: result = genTypeInfo(m, lastSon t, info)
else: internalError("genTypeInfo(" & $t.kind & ')')
of tyUserTypeClasses:
internalAssert t.isResolvedUserTypeClass
return genTypeInfo(m, t.lastSon)
return genTypeInfo(m, t.lastSon, info)
of tyProc:
if t.callConv != ccClosure:
genTypeInfoAuxBase(m, t, t, result, rope"0")
genTypeInfoAuxBase(m, t, t, result, rope"0", info)
else:
let x = fakeClosureType(t.owner)
genTupleInfo(m, x, x, result)
genTupleInfo(m, x, x, result, info)
of tySequence, tyRef, tyOptAsRef:
genTypeInfoAux(m, t, t, result)
genTypeInfoAux(m, t, t, result, info)
if gSelectedGC >= gcMarkAndSweep:
let markerProc = genTraverseProc(m, origType, sig, tiNew)
addf(m.s[cfsTypeInit3], "$1.marker = $2;$n", [result, markerProc])
of tyPtr, tyRange: genTypeInfoAux(m, t, t, result)
of tyArray: genArrayInfo(m, t, result)
of tySet: genSetInfo(m, t, result)
of tyEnum: genEnumInfo(m, t, result)
of tyObject: genObjectInfo(m, t, origType, result)
of tyPtr, tyRange: genTypeInfoAux(m, t, t, result, info)
of tyArray: genArrayInfo(m, t, result, info)
of tySet: genSetInfo(m, t, result, info)
of tyEnum: genEnumInfo(m, t, result, info)
of tyObject: genObjectInfo(m, t, origType, result, info)
of tyTuple:
# if t.n != nil: genObjectInfo(m, t, result)
# else:
# BUGFIX: use consistently RTTI without proper field names; otherwise
# results are not deterministic!
genTupleInfo(m, t, origType, result)
genTupleInfo(m, t, origType, result, info)
else: internalError("genTypeInfo(" & $t.kind & ')')
if t.deepCopy != nil:
genDeepCopyProc(m, t.deepCopy, result)

@@ -16,11 +16,11 @@ import
proc getPragmaStmt*(n: PNode, w: TSpecialWord): PNode =
case n.kind
of nkStmtList:
for i in 0 .. < n.len:
for i in 0 ..< n.len:
result = getPragmaStmt(n[i], w)
if result != nil: break
of nkPragma:
for i in 0 .. < n.len:
for i in 0 ..< n.len:
if whichPragma(n[i]) == w: return n[i]
else: discard

@@ -271,11 +271,11 @@ proc genObjectInit(p: BProc, section: TCProcSection, t: PType, a: TLoc,
|
||||
while (s.kind == tyObject) and (s.sons[0] != nil):
|
||||
add(r, ".Sup")
|
||||
s = skipTypes(s.sons[0], skipPtrs)
|
||||
linefmt(p, section, "$1.m_type = $2;$n", r, genTypeInfo(p.module, t))
|
||||
linefmt(p, section, "$1.m_type = $2;$n", r, genTypeInfo(p.module, t, a.lode.info))
|
||||
of frEmbedded:
|
||||
# worst case for performance:
|
||||
var r = if takeAddr: addrLoc(a) else: rdLoc(a)
|
||||
linefmt(p, section, "#objectInit($1, $2);$n", r, genTypeInfo(p.module, t))
|
||||
linefmt(p, section, "#objectInit($1, $2);$n", r, genTypeInfo(p.module, t, a.lode.info))
|
||||
|
||||
type
|
||||
TAssignmentFlag = enum
|
||||
@@ -306,7 +306,7 @@ proc resetLoc(p: BProc, loc: var TLoc) =
|
||||
linefmt(p, cpsStmts, "#chckNil((void*)$1);$n", addrLoc(loc))
|
||||
if loc.storage != OnStack:
|
||||
linefmt(p, cpsStmts, "#genericReset((void*)$1, $2);$n",
|
||||
addrLoc(loc), genTypeInfo(p.module, loc.t))
|
||||
addrLoc(loc), genTypeInfo(p.module, loc.t, loc.lode.info))
|
||||
# XXX: generated reset procs should not touch the m_type
|
||||
# field, so disabling this should be safe:
|
||||
genObjectInit(p, cpsStmts, loc.t, loc, true)
|
||||
@@ -381,7 +381,7 @@ proc localDebugInfo(p: BProc, s: PSym) =
|
||||
lineF(p, cpsInit,
|
||||
"FR_.s[$1].address = (void*)$3; FR_.s[$1].typ = $4; FR_.s[$1].name = $2;$n",
|
||||
[p.maxFrameLen.rope, makeCString(normalize(s.name.s)), a,
|
||||
genTypeInfo(p.module, s.loc.t)])
|
||||
genTypeInfo(p.module, s.loc.t, s.info)])
|
||||
inc(p.maxFrameLen)
|
||||
inc p.blocks[p.blocks.len-1].frameLen
|
||||
|
||||
@@ -451,7 +451,7 @@ proc assignGlobalVar(p: BProc, n: PNode) =
|
||||
appcg(p.module, p.module.s[cfsDebugInit],
|
||||
"#dbgRegisterGlobal($1, &$2, $3);$n",
|
||||
[makeCString(normalize(s.owner.name.s & '.' & s.name.s)),
|
||||
s.loc.r, genTypeInfo(p.module, s.typ)])
|
||||
s.loc.r, genTypeInfo(p.module, s.typ, n.info)])
|
||||
|
||||
proc assignParam(p: BProc, s: PSym) =
|
||||
assert(s.loc.r != nil)
|
||||
@@ -493,7 +493,32 @@ proc initLocExprSingleUse(p: BProc, e: PNode, result: var TLoc) =
|
||||
proc lenField(p: BProc): Rope =
|
||||
result = rope(if p.module.compileToCpp: "len" else: "Sup.len")
|
||||
|
||||
include ccgcalls, "ccgstmts.nim", "ccgexprs.nim"
|
||||
include ccgcalls, "ccgstmts.nim"
|
||||
|
||||
proc initFrame(p: BProc, procname, filename: Rope): Rope =
|
||||
discard cgsym(p.module, "nimFrame")
|
||||
if p.maxFrameLen > 0:
|
||||
discard cgsym(p.module, "VarSlot")
|
||||
result = rfmt(nil, "\tnimfrs_($1, $2, $3, $4);$n",
|
||||
procname, filename, p.maxFrameLen.rope,
|
||||
p.blocks[0].frameLen.rope)
|
||||
else:
|
||||
result = rfmt(nil, "\tnimfr_($1, $2);$n", procname, filename)
|
||||
|
||||
proc initFrameNoDebug(p: BProc; frame, procname, filename: Rope; line: int): Rope =
|
||||
discard cgsym(p.module, "nimFrame")
|
||||
addf(p.blocks[0].sections[cpsLocals], "TFrame $1;$n", [frame])
|
||||
result = rfmt(nil, "\t$1.procname = $2; $1.filename = $3; " &
|
||||
" $1.line = $4; $1.len = -1; nimFrame(&$1);$n",
|
||||
frame, procname, filename, rope(line))
|
||||
|
||||
proc deinitFrameNoDebug(p: BProc; frame: Rope): Rope =
|
||||
result = rfmt(p.module, "\t#popFrameOfAddr(&$1);$n", frame)
|
||||
|
||||
proc deinitFrame(p: BProc): Rope =
|
||||
result = rfmt(p.module, "\t#popFrame();$n")
|
||||
|
||||
include ccgexprs
|
||||
|
||||
# ----------------------------- dynamic library handling -----------------
|
||||
# We don't finalize dynamic libs as the OS does this for us.
|
||||
@@ -600,7 +625,7 @@ proc symInDynamicLibPartial(m: BModule, sym: PSym) =
|
||||
sym.typ.sym = nil # generate a new name
|
||||
|
||||
proc cgsym(m: BModule, name: string): Rope =
|
||||
var sym = magicsys.getCompilerProc(name)
|
||||
let sym = magicsys.getCompilerProc(name)
|
||||
if sym != nil:
|
||||
case sym.kind
|
||||
of skProc, skFunc, skMethod, skConverter, skIterator: genProc(m, sym)
|
||||
@@ -637,19 +662,6 @@ proc generateHeaders(m: BModule) =
|
||||
add(m.s[cfsHeaders], "#undef powerpc" & tnl)
|
||||
add(m.s[cfsHeaders], "#undef unix" & tnl)
|
||||
|
||||
proc initFrame(p: BProc, procname, filename: Rope): Rope =
|
||||
discard cgsym(p.module, "nimFrame")
|
||||
if p.maxFrameLen > 0:
|
||||
discard cgsym(p.module, "VarSlot")
|
||||
result = rfmt(nil, "\tnimfrs_($1, $2, $3, $4);$n",
|
||||
procname, filename, p.maxFrameLen.rope,
|
||||
p.blocks[0].frameLen.rope)
|
||||
else:
|
||||
result = rfmt(nil, "\tnimfr_($1, $2);$n", procname, filename)
|
||||
|
||||
proc deinitFrame(p: BProc): Rope =
|
||||
result = rfmt(p.module, "\t#popFrame();$n")
|
||||
|
||||
proc closureSetup(p: BProc, prc: PSym) =
|
||||
if tfCapturesEnv notin prc.typ.flags: return
|
||||
# prc.ast[paramsPos].last contains the type we're after:
|
||||
@@ -896,14 +908,15 @@ proc addIntTypes(result: var Rope) {.inline.} =
|
||||
platform.CPU[targetCPU].intSize.rope])
|
||||
|
||||
proc getCopyright(cfile: Cfile): Rope =
|
||||
const copyrightYear = "2017"
|
||||
if optCompileOnly in gGlobalOptions:
|
||||
result = ("/* Generated by Nim Compiler v$1 */$N" &
|
||||
"/* (c) " & CompileDate.substr(0, 3) & " Andreas Rumpf */$N" &
|
||||
"/* (c) " & copyrightYear & " Andreas Rumpf */$N" &
|
||||
"/* The generated code is subject to the original license. */$N") %
|
||||
[rope(VersionAsString)]
|
||||
else:
|
||||
result = ("/* Generated by Nim Compiler v$1 */$N" &
|
||||
"/* (c) " & CompileDate.substr(0, 3) & " Andreas Rumpf */$N" &
|
||||
"/* (c) " & copyrightYear & " Andreas Rumpf */$N" &
|
||||
"/* The generated code is subject to the original license. */$N" &
|
||||
"/* Compiled for: $2, $3, $4 */$N" &
|
||||
"/* Command for C compiler:$n $5 */$N") %
|
||||
@@ -920,7 +933,7 @@ proc getFileHeader(cfile: Cfile): Rope =
|
||||
proc genFilenames(m: BModule): Rope =
|
||||
discard cgsym(m, "dbgRegisterFilename")
|
||||
result = nil
|
||||
for i in 0.. <fileInfos.len:
|
||||
for i in 0..<fileInfos.len:
|
||||
result.addf("dbgRegisterFilename($1);$N", [fileInfos[i].projPath.makeCString])
|
||||
|
||||
proc genMainProc(m: BModule) =
|
||||
@@ -1013,10 +1026,9 @@ proc genMainProc(m: BModule) =
|
||||
ComponentConstruct =
|
||||
"void Libc::Component::construct(Libc::Env &env) {$N" &
|
||||
"\tgenodeEnv = &env;$N" &
|
||||
"\tLibc::with_libc([&] () {$n\t" &
|
||||
"\tLibc::with_libc([&] () {$N\t" &
|
||||
MainProcs &
|
||||
"\t});$N" &
|
||||
"\tenv.parent().exit(0);$N" &
|
||||
"}$N$N"
|
||||
|
||||
var nimMain, otherMain: FormatStr
|
||||
@@ -1154,7 +1166,7 @@ proc genInitCode(m: BModule) =
|
||||
|
||||
for i, el in pairs(m.extensionLoaders):
|
||||
if el != nil:
|
||||
let ex = "N_NIMCALL(void, nimLoadProcs$1)(void) {$2}$N$N" %
|
||||
let ex = "NIM_EXTERNC N_NIMCALL(void, nimLoadProcs$1)(void) {$2}$N$N" %
|
||||
[(i.ord - '0'.ord).rope, el]
|
||||
add(m.s[cfsInitProc], ex)
|
||||
|
||||
@@ -1246,7 +1258,7 @@ proc resetModule*(m: BModule) =
|
||||
|
||||
# indicate that this is now cached module
|
||||
# the cache will be invalidated by nullifying gModules
|
||||
m.fromCache = true
|
||||
#m.fromCache = true
|
||||
m.g = nil
|
||||
|
||||
# we keep only the "merge info" information for the module
|
||||
@@ -1293,7 +1305,7 @@ proc myOpen(graph: ModuleGraph; module: PSym; cache: IdentCache): PPassContext =
|
||||
|
||||
proc writeHeader(m: BModule) =
|
||||
var result = ("/* Generated by Nim Compiler v$1 */$N" &
|
||||
"/* (c) " & CompileDate.substr(0, 3) & " Andreas Rumpf */$N" &
|
||||
"/* (c) 2017 Andreas Rumpf */$N" &
|
||||
"/* The generated code is subject to the original license. */$N") %
|
||||
[rope(VersionAsString)]
|
||||
|
||||
@@ -1324,7 +1336,6 @@ proc getCFile(m: BModule): string =
|
||||
|
||||
proc myOpenCached(graph: ModuleGraph; module: PSym, rd: PRodReader): PPassContext =
|
||||
injectG(graph.config)
|
||||
assert optSymbolFiles in gGlobalOptions
|
||||
var m = newModule(g, module)
|
||||
readMergeInfo(getCFile(m), m)
|
||||
result = m
|
||||
@@ -1378,7 +1389,7 @@ proc writeModule(m: BModule, pending: bool) =
|
||||
# generate code for the init statements of the module:
|
||||
let cfile = getCFile(m)
|
||||
|
||||
if not m.fromCache or optForceFullMake in gGlobalOptions:
|
||||
if m.rd == nil or optForceFullMake in gGlobalOptions:
|
||||
genInitCode(m)
|
||||
finishTypeDescriptions(m)
|
||||
if sfMainModule in m.module.flags:
|
||||
@@ -1431,6 +1442,10 @@ proc myClose(graph: ModuleGraph; b: PPassContext, n: PNode): PNode =
|
||||
result = n
|
||||
if b == nil or passes.skipCodegen(n): return
|
||||
var m = BModule(b)
|
||||
# if the module is cached, we don't regenerate the main proc
|
||||
# nor the dispatchers? But if the dispatchers changed?
|
||||
# XXX emit the dispatchers into its own .c file?
|
||||
if b.rd != nil: return
|
||||
if n != nil:
|
||||
m.initProc.options = initProcOptions(m)
|
||||
genStmts(m.initProc, n)
|
||||
@@ -1453,10 +1468,10 @@ proc cgenWriteModules*(backend: RootRef, config: ConfigRef) =
|
||||
if g.generatedHeader != nil: finishModule(g.generatedHeader)
|
||||
while g.forwardedProcsCounter > 0:
|
||||
for m in cgenModules(g):
|
||||
if not m.fromCache:
|
||||
if m.rd == nil:
|
||||
finishModule(m)
|
||||
for m in cgenModules(g):
|
||||
if m.fromCache:
|
||||
if m.rd != nil:
|
||||
m.updateCachedModule
|
||||
else:
|
||||
m.writeModule(pending=true)
|
||||
|
||||
@@ -54,7 +54,7 @@ type
|
||||
TCProcSections* = array[TCProcSection, Rope] # represents a generated C proc
|
||||
BModule* = ref TCGen
|
||||
BProc* = ref TCProc
|
||||
TBlock*{.final.} = object
|
||||
TBlock* = object
|
||||
id*: int # the ID of the label; positive means that it
|
||||
label*: Rope # generated text for the label
|
||||
# nil if label is not used
|
||||
@@ -64,7 +64,7 @@ type
|
||||
nestedExceptStmts*: int16 # how many except statements is it nested into
|
||||
frameLen*: int16
|
||||
|
||||
TCProc{.final.} = object # represents C proc that is currently generated
|
||||
TCProc = object # represents C proc that is currently generated
|
||||
prc*: PSym # the Nim proc that this C proc belongs to
|
||||
beforeRetNeeded*: bool # true iff 'BeforeRet' label for proc is needed
|
||||
threadVarAccessed*: bool # true if the proc already accessed some threadvar
|
||||
@@ -154,7 +154,7 @@ proc includeHeader*(this: BModule; header: string) =
|
||||
|
||||
proc s*(p: BProc, s: TCProcSection): var Rope {.inline.} =
|
||||
# section in the current block
|
||||
result = p.blocks[^1].sections[s]
|
||||
result = p.blocks[p.blocks.len-1].sections[s]
|
||||
|
||||
proc procSec*(p: BProc, s: TCProcSection): var Rope {.inline.} =
|
||||
# top level proc sections
|
||||
|
||||
@@ -53,8 +53,8 @@ proc processSwitch*(switch, arg: string, pass: TCmdLinePass, info: TLineInfo;
|
||||
# implementation
|
||||
|
||||
const
|
||||
HelpMessage = "Nim Compiler Version $1 (" & CompileDate & ") [$2: $3]\n" &
|
||||
"Copyright (c) 2006-" & CompileDate.substr(0, 3) & " by Andreas Rumpf\n"
|
||||
HelpMessage = "Nim Compiler Version $1 [$2: $3]\n" &
|
||||
"Copyright (c) 2006-2017 by Andreas Rumpf\n"
|
||||
|
||||
const
|
||||
Usage = slurp"../doc/basicopt.txt".replace("//", "")
|
||||
@@ -261,7 +261,7 @@ proc testCompileOption*(switch: string, info: TLineInfo): bool =
|
||||
of "assertions", "a": result = contains(gOptions, optAssert)
|
||||
of "deadcodeelim": result = contains(gGlobalOptions, optDeadCodeElim)
|
||||
of "run", "r": result = contains(gGlobalOptions, optRun)
|
||||
of "symbolfiles": result = contains(gGlobalOptions, optSymbolFiles)
|
||||
of "symbolfiles": result = gSymbolFiles != disabledSf
|
||||
of "genscript": result = contains(gGlobalOptions, optGenScript)
|
||||
of "threads": result = contains(gGlobalOptions, optThreads)
|
||||
of "taintmode": result = contains(gGlobalOptions, optTaintMode)
|
||||
@@ -343,7 +343,9 @@ proc processSwitch(switch, arg: string, pass: TCmdLinePass, info: TLineInfo;
|
||||
# keep the old name for compat
|
||||
if pass in {passCmd2, passPP} and not options.gNoNimblePath:
|
||||
expectArg(switch, arg, pass, info)
|
||||
let path = processPath(arg, info, notRelativeToProj=true)
|
||||
var path = processPath(arg, info, notRelativeToProj=true)
|
||||
let nimbleDir = getEnv("NIMBLE_DIR")
|
||||
if nimbleDir.len > 0 and pass == passPP: path = nimbleDir / "pkgs"
|
||||
nimblePath(path, info)
|
||||
of "nonimblepath", "nobabelpath":
|
||||
expectNoArg(switch, arg, pass, info)
|
||||
@@ -596,7 +598,12 @@ proc processSwitch(switch, arg: string, pass: TCmdLinePass, info: TLineInfo;
|
||||
expectNoArg(switch, arg, pass, info)
|
||||
helpOnError(pass)
|
||||
of "symbolfiles":
|
||||
processOnOffSwitchG({optSymbolFiles}, arg, pass, info)
|
||||
case arg.normalize
|
||||
of "on": gSymbolFiles = enabledSf
|
||||
of "off": gSymbolFiles = disabledSf
|
||||
of "writeonly": gSymbolFiles = writeOnlySf
|
||||
of "readonly": gSymbolFiles = readOnlySf
|
||||
else: localError(info, errOnOrOffExpectedButXFound, arg)
|
||||
of "skipcfg":
|
||||
expectNoArg(switch, arg, pass, info)
|
||||
incl(gGlobalOptions, optSkipConfigFile)
|
||||
@@ -609,7 +616,7 @@ proc processSwitch(switch, arg: string, pass: TCmdLinePass, info: TLineInfo;
|
||||
of "skipparentcfg":
|
||||
expectNoArg(switch, arg, pass, info)
|
||||
incl(gGlobalOptions, optSkipParentConfigFiles)
|
||||
of "genscript":
|
||||
of "genscript", "gendeps":
|
||||
expectNoArg(switch, arg, pass, info)
|
||||
incl(gGlobalOptions, optGenScript)
|
||||
of "colors": processOnOffSwitchG({optUseColors}, arg, pass, info)
|
||||
@@ -652,6 +659,9 @@ proc processSwitch(switch, arg: string, pass: TCmdLinePass, info: TLineInfo;
|
||||
gListFullPaths = true
|
||||
of "dynliboverride":
|
||||
dynlibOverride(switch, arg, pass, info)
|
||||
of "dynliboverrideall":
|
||||
expectNoArg(switch, arg, pass, info)
|
||||
gDynlibOverrideAll = true
|
||||
of "cs":
|
||||
# only supported for compatibility. Does nothing.
|
||||
expectArg(switch, arg, pass, info)
|
||||
|
||||
@@ -109,3 +109,6 @@ proc initDefines*() =
|
||||
defineSymbol("nimGenericInOutFlags")
|
||||
when false: defineSymbol("nimHasOpt")
|
||||
defineSymbol("nimNoArrayToCstringConversion")
|
||||
defineSymbol("nimNewRoof")
|
||||
defineSymbol("nimHasRunnableExamples")
|
||||
defineSymbol("nimNewDot")
|
||||
|
||||
@@ -10,7 +10,7 @@
# This module implements a dependency file generator.

import
os, options, ast, astalgo, msgs, ropes, idents, passes, importer
os, options, ast, astalgo, msgs, ropes, idents, passes, modulepaths

from modulegraphs import ModuleGraph

@@ -93,9 +93,7 @@
|
||||
|
||||
import
|
||||
intsets, ast, astalgo, msgs, renderer, magicsys, types, idents, trees,
|
||||
strutils, options, dfa, lowerings
|
||||
|
||||
template hasDestructor(t: PType): bool = tfHasAsgn in t.flags
|
||||
strutils, options, dfa, lowerings, rodread
|
||||
|
||||
const
|
||||
InterestingSyms = {skVar, skResult, skLet}
|
||||
@@ -166,18 +164,50 @@ proc isHarmlessVar*(s: PSym; c: Con): bool =
|
||||
template interestingSym(s: PSym): bool =
|
||||
s.owner == c.owner and s.kind in InterestingSyms and hasDestructor(s.typ)
|
||||
|
||||
proc genSink(t: PType; dest: PNode): PNode =
|
||||
let op = if t.sink != nil: t.sink else: t.assignment
|
||||
assert op != nil
|
||||
proc patchHead(n: PNode) =
|
||||
if n.kind in nkCallKinds and n[0].kind == nkSym and n.len > 1:
|
||||
let s = n[0].sym
|
||||
if s.name.s[0] == '=' and s.name.s in ["=sink", "=", "=destroy"]:
|
||||
if sfFromGeneric in s.flags:
|
||||
excl(s.flags, sfFromGeneric)
|
||||
patchHead(s.getBody)
|
||||
if n[1].typ.isNil:
|
||||
# XXX toptree crashes without this workaround. Figure out why.
|
||||
return
|
||||
let t = n[1].typ.skipTypes({tyVar, tyGenericInst, tyAlias, tyInferred})
|
||||
template patch(op, field) =
|
||||
if s.name.s == op and field != nil and field != s:
|
||||
n.sons[0].sym = field
|
||||
patch "=sink", t.sink
|
||||
patch "=", t.assignment
|
||||
patch "=destroy", t.destructor
|
||||
for x in n:
|
||||
patchHead(x)
|
||||
|
||||
proc patchHead(s: PSym) =
|
||||
if sfFromGeneric in s.flags:
|
||||
patchHead(s.ast[bodyPos])
|
||||
|
||||
template genOp(opr, opname) =
|
||||
let op = opr
|
||||
if op == nil:
|
||||
globalError(dest.info, "internal error: '" & opname & "' operator not found for type " & typeToString(t))
|
||||
elif op.ast[genericParamsPos].kind != nkEmpty:
|
||||
globalError(dest.info, "internal error: '" & opname & "' operator is generic")
|
||||
patchHead op
|
||||
result = newTree(nkCall, newSymNode(op), newTree(nkHiddenAddr, dest))
|
||||
|
||||
proc genSink(t: PType; dest: PNode): PNode =
|
||||
let t = t.skipTypes({tyGenericInst, tyAlias})
|
||||
genOp(if t.sink != nil: t.sink else: t.assignment, "=sink")
|
||||
|
||||
proc genCopy(t: PType; dest: PNode): PNode =
|
||||
assert t.assignment != nil
|
||||
result = newTree(nkCall, newSymNode(t.assignment), newTree(nkHiddenAddr, dest))
|
||||
let t = t.skipTypes({tyGenericInst, tyAlias})
|
||||
genOp(t.assignment, "=")
|
||||
|
||||
proc genDestroy(t: PType; dest: PNode): PNode =
|
||||
assert t.destructor != nil
|
||||
result = newTree(nkCall, newSymNode(t.destructor), newTree(nkHiddenAddr, dest))
|
||||
let t = t.skipTypes({tyGenericInst, tyAlias})
|
||||
genOp(t.destructor, "=destroy")
|
||||
|
||||
proc addTopVar(c: var Con; v: PNode) =
|
||||
c.topLevelVars.add newTree(nkIdentDefs, v, emptyNode, emptyNode)
|
||||
@@ -189,7 +219,7 @@ template recurse(n, dest) =
|
||||
dest.add p(n[i], c)
|
||||
|
||||
proc moveOrCopy(dest, ri: PNode; c: var Con): PNode =
|
||||
if ri.kind in nkCallKinds:
|
||||
if ri.kind in nkCallKinds+{nkObjConstr}:
|
||||
result = genSink(ri.typ, dest)
|
||||
# watch out and no not transform 'ri' twice if it's a call:
|
||||
let ri2 = copyNode(ri)
|
||||
@@ -253,7 +283,7 @@ proc p(n: PNode; c: var Con): PNode =
|
||||
result = copyNode(n)
|
||||
recurse(n, result)
|
||||
of nkAsgn, nkFastAsgn:
|
||||
if n[0].kind == nkSym and interestingSym(n[0].sym):
|
||||
if hasDestructor(n[0].typ):
|
||||
result = moveOrCopy(n[0], n[1], c)
|
||||
else:
|
||||
result = copyNode(n)
|
||||
@@ -266,6 +296,8 @@ proc p(n: PNode; c: var Con): PNode =
|
||||
recurse(n, result)
|
||||
|
||||
proc injectDestructorCalls*(owner: PSym; n: PNode): PNode =
|
||||
when defined(nimDebugDestroys):
|
||||
echo "injecting into ", n
|
||||
var c: Con
|
||||
c.owner = owner
|
||||
c.tmp = newSym(skTemp, getIdent":d", owner, n.info)
|
||||
@@ -291,6 +323,7 @@ proc injectDestructorCalls*(owner: PSym; n: PNode): PNode =
|
||||
result.add body
|
||||
|
||||
when defined(nimDebugDestroys):
|
||||
echo "------------------------------------"
|
||||
echo owner.name.s, " transformed to: "
|
||||
echo result
|
||||
if owner.name.s == "main" or true:
|
||||
echo "------------------------------------"
|
||||
echo owner.name.s, " transformed to: "
|
||||
echo result
|
||||
185
compiler/dfa.nim
@@ -132,7 +132,7 @@ proc gen(c: var Con; n: PNode) # {.noSideEffect.}
|
||||
proc genWhile(c: var Con; n: PNode) =
|
||||
# L1:
|
||||
# cond, tmp
|
||||
# fjmp tmp, L2
|
||||
# fork tmp, L2
|
||||
# body
|
||||
# jmp L1
|
||||
# L2:
|
||||
@@ -168,15 +168,13 @@ proc genIf(c: var Con, n: PNode) =
|
||||
var endings: seq[TPosition] = @[]
|
||||
for i in countup(0, len(n) - 1):
|
||||
var it = n.sons[i]
|
||||
c.gen(it.sons[0])
|
||||
if it.len == 2:
|
||||
c.gen(it.sons[0].sons[1])
|
||||
var elsePos = c.forkI(it.sons[0].sons[1])
|
||||
let elsePos = c.forkI(it.sons[1])
|
||||
c.gen(it.sons[1])
|
||||
if i < sonsLen(n)-1:
|
||||
endings.add(c.gotoI(it.sons[1]))
|
||||
c.patch(elsePos)
|
||||
else:
|
||||
c.gen(it.sons[0])
|
||||
for endPos in endings: c.patch(endPos)
|
||||
|
||||
proc genAndOr(c: var Con; n: PNode) =
|
||||
@@ -202,7 +200,7 @@ proc genCase(c: var Con; n: PNode) =
|
||||
# Lend:
|
||||
var endings: seq[TPosition] = @[]
|
||||
c.gen(n.sons[0])
|
||||
for i in 1 .. <n.len:
|
||||
for i in 1 ..< n.len:
|
||||
let it = n.sons[i]
|
||||
if it.len == 1:
|
||||
c.gen(it.sons[0])
|
||||
@@ -219,7 +217,7 @@ proc genTry(c: var Con; n: PNode) =
|
||||
let elsePos = c.forkI(n)
|
||||
c.gen(n.sons[0])
|
||||
c.patch(elsePos)
|
||||
for i in 1 .. <n.len:
|
||||
for i in 1 ..< n.len:
|
||||
let it = n.sons[i]
|
||||
if it.kind != nkFinally:
|
||||
var blen = len(it)
|
||||
@@ -337,100 +335,107 @@ proc gen(c: var Con; n: PNode) =
|
||||
else: discard
|
||||
|
||||
proc dfa(code: seq[Instr]) =
|
||||
# We aggressively push 'undef' values for every 'use v' instruction
|
||||
# until they are eliminated via a 'def v' instructions.
|
||||
# If we manage to push one 'undef' to a 'use' instruction, we produce
|
||||
# an error:
|
||||
var undef = initIntSet()
|
||||
var u = newSeq[IntSet](code.len) # usages
|
||||
var d = newSeq[IntSet](code.len) # defs
|
||||
var c = newSeq[IntSet](code.len) # consumed
|
||||
var backrefs = initTable[int, int]()
|
||||
for i in 0..<code.len:
|
||||
if code[i].kind == use: undef.incl(code[i].sym.id)
|
||||
u[i] = initIntSet()
|
||||
d[i] = initIntSet()
|
||||
c[i] = initIntSet()
|
||||
case code[i].kind
|
||||
of use, useWithinCall: u[i].incl(code[i].sym.id)
|
||||
of def: d[i].incl(code[i].sym.id)
|
||||
of fork, goto:
|
||||
let d = i+code[i].dest
|
||||
backrefs.add(d, i)
|
||||
|
||||
var s = newSeq[IntSet](code.len)
|
||||
for i in 0..<code.len:
|
||||
assign(s[i], undef)
|
||||
|
||||
# In the original paper, W := {0,...,n} is done. This is wasteful, we
|
||||
# have no intention to analyse a program like
|
||||
#
|
||||
# return 3
|
||||
# echo a + b
|
||||
#
|
||||
# any further than necessary.
|
||||
var w = @[0]
|
||||
while w.len > 0:
|
||||
var pc = w[^1]
|
||||
var maxIters = 50
|
||||
var someChange = true
|
||||
var takenGotos = initIntSet()
|
||||
var consuming = -1
|
||||
while w.len > 0 and maxIters > 0: # and someChange:
|
||||
dec maxIters
|
||||
var pc = w.pop() # w[^1]
|
||||
var prevPc = -1
|
||||
# this simulates a single linear control flow execution:
|
||||
while true:
|
||||
# according to the paper, it is better to shrink the working set here
|
||||
# in this inner loop:
|
||||
let widx = w.find(pc)
|
||||
if widx >= 0: w.del(widx)
|
||||
while pc < code.len:
|
||||
if prevPc >= 0:
|
||||
someChange = false
|
||||
# merge step and test for changes (we compute the fixpoints here):
|
||||
# 'u' needs to be the union of prevPc, pc
|
||||
# 'd' needs to be the intersection of 'pc'
|
||||
for id in u[prevPc]:
|
||||
if not u[pc].containsOrIncl(id):
|
||||
someChange = true
|
||||
# in (a; b) if ``a`` sets ``v`` so does ``b``. The intersection
|
||||
# is only interesting on merge points:
|
||||
for id in d[prevPc]:
|
||||
if not d[pc].containsOrIncl(id):
|
||||
someChange = true
|
||||
# if this is a merge point, we take the intersection of the 'd' sets:
|
||||
if backrefs.hasKey(pc):
|
||||
var intersect = initIntSet()
|
||||
assign(intersect, d[pc])
|
||||
var first = true
|
||||
for prevPc in backrefs.allValues(pc):
|
||||
for def in d[pc]:
|
||||
if def notin d[prevPc]:
|
||||
excl(intersect, def)
|
||||
someChange = true
|
||||
when defined(debugDfa):
|
||||
echo "Excluding ", pc, " prev ", prevPc
|
||||
assign d[pc], intersect
|
||||
if consuming >= 0:
|
||||
if not c[pc].containsOrIncl(consuming):
|
||||
someChange = true
|
||||
consuming = -1
|
||||
|
||||
# our interpretation ![I!]:
|
||||
var sid = -1
|
||||
prevPc = pc
|
||||
case code[pc].kind
|
||||
of goto, fork: discard
|
||||
of use, useWithinCall:
|
||||
let sym = code[pc].sym
|
||||
if s[pc].contains(sym.id):
|
||||
localError(code[pc].n.info, "variable read before initialized: " & sym.name.s)
|
||||
of def:
|
||||
sid = code[pc].sym.id
|
||||
|
||||
var pc2: int
|
||||
if code[pc].kind == goto:
|
||||
pc2 = pc + code[pc].dest
|
||||
else:
|
||||
pc2 = pc + 1
|
||||
if code[pc].kind == fork:
|
||||
let l = pc + code[pc].dest
|
||||
if sid >= 0 and s[l].missingOrExcl(sid):
|
||||
w.add l
|
||||
|
||||
if sid >= 0 and s[pc2].missingOrExcl(sid):
|
||||
pc = pc2
|
||||
else:
|
||||
break
|
||||
if pc >= code.len: break
|
||||
|
||||
when false:
|
||||
case code[pc].kind
|
||||
of use:
|
||||
let s = code[pc].sym
|
||||
if undefB.contains(s.id):
|
||||
localError(code[pc].n.info, "variable read before initialized: " & s.name.s)
|
||||
break
|
||||
inc pc
|
||||
of def:
|
||||
let s = code[pc].sym
|
||||
# exclude 'undef' for s for this path through the graph.
|
||||
if not undefB.missingOrExcl(s.id):
|
||||
inc pc
|
||||
else:
|
||||
break
|
||||
#undefB.excl s.id
|
||||
#inc pc
|
||||
when false:
|
||||
let prev = bindings.getOrDefault(s.id)
|
||||
if prev != value:
|
||||
# well now it has a value and we made progress, so
|
||||
bindings[s.id] = value
|
||||
inc pc
|
||||
else:
|
||||
break
|
||||
of fork:
|
||||
let diff = code[pc].dest
|
||||
# we follow pc + 1 and remember the label for later:
|
||||
w.add pc+diff
|
||||
inc pc
|
||||
of goto:
|
||||
let diff = code[pc].dest
|
||||
pc = pc + diff
|
||||
if pc >= code.len: break
|
||||
# we must leave endless loops eventually:
|
||||
if not takenGotos.containsOrIncl(pc) or someChange:
|
||||
pc = pc + code[pc].dest
|
||||
else:
|
||||
inc pc
|
||||
of fork:
|
||||
# we follow the next instruction but push the dest onto our "work" stack:
|
||||
#if someChange:
|
||||
w.add pc + code[pc].dest
|
||||
inc pc
|
||||
of use, useWithinCall:
|
||||
#if not d[prevPc].missingOrExcl():
|
||||
# someChange = true
|
||||
consuming = code[pc].sym.id
|
||||
inc pc
|
||||
of def:
|
||||
if not d[pc].containsOrIncl(code[pc].sym.id):
|
||||
someChange = true
|
||||
inc pc
|
||||
|
||||
when defined(useDfa) and defined(debugDfa):
|
||||
for i in 0..<code.len:
|
||||
echo "PC ", i, ": defs: ", d[i], "; uses ", u[i], "; consumes ", c[i]
|
||||
|
||||
# now check the condition we're interested in:
|
||||
for i in 0..<code.len:
|
||||
case code[i].kind
|
||||
of use, useWithinCall:
|
||||
let s = code[i].sym
|
||||
if s.id notin d[i]:
|
||||
localError(code[i].n.info, "usage of uninitialized variable: " & s.name.s)
|
||||
if s.id in c[i]:
|
||||
localError(code[i].n.info, "usage of an already consumed variable: " & s.name.s)
|
||||
|
||||
else: discard
|
||||
|
||||
proc dataflowAnalysis*(s: PSym; body: PNode) =
|
||||
var c = Con(code: @[], blocks: @[])
|
||||
gen(c, body)
|
||||
#echoCfg(c.code)
|
||||
when defined(useDfa) and defined(debugDfa): echoCfg(c.code)
|
||||
dfa(c.code)
|
||||
|
||||
proc constructCfg*(s: PSym; body: PNode): ControlFlowGraph =
|
||||
|
||||
@@ -15,14 +15,13 @@ import
|
||||
ast, strutils, strtabs, options, msgs, os, ropes, idents,
|
||||
wordrecg, syntaxes, renderer, lexer, packages/docutils/rstast,
|
||||
packages/docutils/rst, packages/docutils/rstgen, times,
|
||||
packages/docutils/highlite, importer, sempass2, json, xmltree, cgi,
|
||||
typesrenderer, astalgo
|
||||
packages/docutils/highlite, sempass2, json, xmltree, cgi,
|
||||
typesrenderer, astalgo, modulepaths
|
||||
|
||||
type
|
||||
TSections = array[TSymKind, Rope]
|
||||
TDocumentor = object of rstgen.RstGenerator
|
||||
modDesc: Rope # module description
|
||||
id: int # for generating IDs
|
||||
toc, section: TSections
|
||||
indexValFilename: string
|
||||
analytics: string # Google Analytics javascript, "" if doesn't exist
|
||||
@@ -109,6 +108,8 @@ proc newDocumentor*(filename: string, config: StringTableRef): PDoc =
|
||||
result.id = 100
|
||||
result.jArray = newJArray()
|
||||
initStrTable result.types
|
||||
result.onTestSnippet = proc (d: var RstGenerator; filename, cmd: string; status: int; content: string) =
|
||||
localError(newLineInfo(d.filename, -1, -1), warnUser, "only 'rst2html' supports the ':test:' attribute")
|
||||
|
||||
proc dispA(dest: var Rope, xml, tex: string, args: openArray[Rope]) =
|
||||
if gCmd != cmdRst2tex: addf(dest, xml, args)
|
||||
@@ -204,10 +205,87 @@ proc getPlainDocstring(n: PNode): string =
|
||||
if n.comment != nil and startsWith(n.comment, "##"):
|
||||
result = n.comment
|
||||
if result.len < 1:
|
||||
if n.kind notin {nkEmpty..nkNilLit}:
|
||||
for i in countup(0, len(n)-1):
|
||||
result = getPlainDocstring(n.sons[i])
|
||||
if result.len > 0: return
|
||||
for i in countup(0, safeLen(n)-1):
|
||||
result = getPlainDocstring(n.sons[i])
|
||||
if result.len > 0: return
|
||||
|
||||
proc nodeToHighlightedHtml(d: PDoc; n: PNode; result: var Rope; renderFlags: TRenderFlags = {}) =
|
||||
var r: TSrcGen
|
||||
var literal = ""
|
||||
initTokRender(r, n, renderFlags)
|
||||
var kind = tkEof
|
||||
while true:
|
||||
getNextTok(r, kind, literal)
|
||||
case kind
|
||||
of tkEof:
|
||||
break
|
||||
of tkComment:
|
||||
dispA(result, "<span class=\"Comment\">$1</span>", "\\spanComment{$1}",
|
||||
[rope(esc(d.target, literal))])
|
||||
of tokKeywordLow..tokKeywordHigh:
|
||||
dispA(result, "<span class=\"Keyword\">$1</span>", "\\spanKeyword{$1}",
|
||||
[rope(literal)])
|
||||
of tkOpr:
|
||||
dispA(result, "<span class=\"Operator\">$1</span>", "\\spanOperator{$1}",
|
||||
[rope(esc(d.target, literal))])
|
||||
of tkStrLit..tkTripleStrLit:
|
||||
dispA(result, "<span class=\"StringLit\">$1</span>",
|
||||
"\\spanStringLit{$1}", [rope(esc(d.target, literal))])
|
||||
of tkCharLit:
|
||||
dispA(result, "<span class=\"CharLit\">$1</span>", "\\spanCharLit{$1}",
|
||||
[rope(esc(d.target, literal))])
|
||||
of tkIntLit..tkUInt64Lit:
|
||||
dispA(result, "<span class=\"DecNumber\">$1</span>",
|
||||
"\\spanDecNumber{$1}", [rope(esc(d.target, literal))])
|
||||
of tkFloatLit..tkFloat128Lit:
|
||||
dispA(result, "<span class=\"FloatNumber\">$1</span>",
|
||||
"\\spanFloatNumber{$1}", [rope(esc(d.target, literal))])
|
||||
of tkSymbol:
|
||||
dispA(result, "<span class=\"Identifier\">$1</span>",
|
||||
"\\spanIdentifier{$1}", [rope(esc(d.target, literal))])
|
||||
of tkSpaces, tkInvalid:
|
||||
add(result, literal)
|
||||
of tkCurlyDotLe:
|
||||
dispA(result, """<span class="Other pragmabegin">$1</span><div class="pragma">""",
|
||||
"\\spanOther{$1}",
|
||||
[rope(esc(d.target, literal))])
|
||||
of tkCurlyDotRi:
|
||||
dispA(result, "</div><span class=\"Other pragmaend\">$1</span>",
|
||||
"\\spanOther{$1}",
|
||||
[rope(esc(d.target, literal))])
|
||||
of tkParLe, tkParRi, tkBracketLe, tkBracketRi, tkCurlyLe, tkCurlyRi,
|
||||
tkBracketDotLe, tkBracketDotRi, tkParDotLe,
|
||||
tkParDotRi, tkComma, tkSemiColon, tkColon, tkEquals, tkDot, tkDotDot,
|
||||
tkAccent, tkColonColon,
|
||||
tkGStrLit, tkGTripleStrLit, tkInfixOpr, tkPrefixOpr, tkPostfixOpr:
|
||||
dispA(result, "<span class=\"Other\">$1</span>", "\\spanOther{$1}",
|
||||
[rope(esc(d.target, literal))])
|
||||
|
||||
proc getAllRunnableExamples(d: PDoc; n: PNode; dest: var Rope) =
|
||||
case n.kind
|
||||
of nkCallKinds:
|
||||
if n[0].kind == nkSym and n[0].sym.magic == mRunnableExamples and
|
||||
n.len >= 2 and n.lastSon.kind == nkStmtList:
|
||||
dispA(dest, "\n<strong class=\"examples_text\">$1</strong>\n",
|
||||
"\n\\textbf{$1}\n", [rope"Examples:"])
|
||||
inc d.listingCounter
|
||||
let id = $d.listingCounter
|
||||
dest.add(d.config.getOrDefault"doc.listing_start" % [id, "langNim"])
|
||||
# this is a rather hacky way to get rid of the initial indentation
|
||||
# that the renderer currently produces:
|
||||
var i = 0
|
||||
var body = n.lastSon
|
||||
if body.len == 1 and body.kind == nkStmtList and
|
||||
body.lastSon.kind == nkStmtList:
|
||||
body = body.lastSon
|
||||
for b in body:
|
||||
if i > 0: dest.add "\n"
|
||||
inc i
|
||||
nodeToHighlightedHtml(d, b, dest, {})
|
||||
dest.add(d.config.getOrDefault"doc.listing_end" % id)
|
||||
else: discard
|
||||
for i in 0 ..< n.safeLen:
|
||||
getAllRunnableExamples(d, n[i], dest)
|
||||
|
||||
when false:
|
||||
proc findDocComment(n: PNode): PNode =
|
||||
@@ -252,7 +330,7 @@ proc getName(d: PDoc, n: PNode, splitAfter = -1): string =
|
||||
of nkIdent: result = esc(d.target, n.ident.s, splitAfter)
|
||||
of nkAccQuoted:
|
||||
result = esc(d.target, "`")
|
||||
for i in 0.. <n.len: result.add(getName(d, n[i], splitAfter))
|
||||
for i in 0..<n.len: result.add(getName(d, n[i], splitAfter))
|
||||
result.add esc(d.target, "`")
|
||||
of nkOpenSymChoice, nkClosedSymChoice:
|
||||
result = getName(d, n[0], splitAfter)
|
||||
@@ -268,7 +346,7 @@ proc getNameIdent(n: PNode): PIdent =
|
||||
of nkIdent: result = n.ident
|
||||
of nkAccQuoted:
|
||||
var r = ""
|
||||
for i in 0.. <n.len: r.add(getNameIdent(n[i]).s)
|
||||
for i in 0..<n.len: r.add(getNameIdent(n[i]).s)
|
||||
result = getIdent(r)
|
||||
of nkOpenSymChoice, nkClosedSymChoice:
|
||||
result = getNameIdent(n[0])
|
||||
@@ -283,7 +361,7 @@ proc getRstName(n: PNode): PRstNode =
|
||||
of nkIdent: result = newRstNode(rnLeaf, n.ident.s)
|
||||
of nkAccQuoted:
|
||||
result = getRstName(n.sons[0])
|
||||
for i in 1 .. <n.len: result.text.add(getRstName(n[i]).text)
|
||||
for i in 1 ..< n.len: result.text.add(getRstName(n[i]).text)
|
||||
of nkOpenSymChoice, nkClosedSymChoice:
|
||||
result = getRstName(n[0])
|
||||
else:
|
||||
@@ -379,11 +457,12 @@ proc genItem(d: PDoc, n, nameNode: PNode, k: TSymKind) =
|
||||
let
|
||||
name = getName(d, nameNode)
|
||||
nameRope = name.rope
|
||||
plainDocstring = getPlainDocstring(n) # call here before genRecComment!
|
||||
var plainDocstring = getPlainDocstring(n) # call here before genRecComment!
|
||||
var result: Rope = nil
|
||||
var literal, plainName = ""
|
||||
var kind = tkEof
|
||||
var comm = genRecComment(d, n) # call this here for the side-effect!
|
||||
getAllRunnableExamples(d, n, comm)
|
||||
var r: TSrcGen
|
||||
# Obtain the plain rendered string for hyperlink titles.
|
||||
initTokRender(r, n, {renderNoBody, renderNoComments, renderDocComments,
|
||||
@@ -395,53 +474,7 @@ proc genItem(d: PDoc, n, nameNode: PNode, k: TSymKind) =
|
||||
plainName.add(literal)
|
||||
|
||||
# Render the HTML hyperlink.
|
||||
initTokRender(r, n, {renderNoBody, renderNoComments, renderDocComments})
|
||||
while true:
|
||||
getNextTok(r, kind, literal)
|
||||
case kind
|
||||
of tkEof:
|
||||
break
|
||||
of tkComment:
|
||||
dispA(result, "<span class=\"Comment\">$1</span>", "\\spanComment{$1}",
|
||||
[rope(esc(d.target, literal))])
|
||||
of tokKeywordLow..tokKeywordHigh:
|
||||
dispA(result, "<span class=\"Keyword\">$1</span>", "\\spanKeyword{$1}",
|
||||
[rope(literal)])
|
||||
of tkOpr:
|
||||
dispA(result, "<span class=\"Operator\">$1</span>", "\\spanOperator{$1}",
|
||||
[rope(esc(d.target, literal))])
|
||||
of tkStrLit..tkTripleStrLit:
|
||||
dispA(result, "<span class=\"StringLit\">$1</span>",
|
||||
"\\spanStringLit{$1}", [rope(esc(d.target, literal))])
|
||||
of tkCharLit:
|
||||
dispA(result, "<span class=\"CharLit\">$1</span>", "\\spanCharLit{$1}",
|
||||
[rope(esc(d.target, literal))])
|
||||
of tkIntLit..tkUInt64Lit:
|
||||
dispA(result, "<span class=\"DecNumber\">$1</span>",
|
||||
"\\spanDecNumber{$1}", [rope(esc(d.target, literal))])
|
||||
of tkFloatLit..tkFloat128Lit:
|
||||
dispA(result, "<span class=\"FloatNumber\">$1</span>",
|
||||
"\\spanFloatNumber{$1}", [rope(esc(d.target, literal))])
|
||||
of tkSymbol:
|
||||
dispA(result, "<span class=\"Identifier\">$1</span>",
|
||||
"\\spanIdentifier{$1}", [rope(esc(d.target, literal))])
|
||||
of tkSpaces, tkInvalid:
|
||||
add(result, literal)
|
||||
of tkCurlyDotLe:
|
||||
dispA(result, """<span class="Other pragmabegin">$1</span><div class="pragma">""",
|
||||
"\\spanOther{$1}",
|
||||
[rope(esc(d.target, literal))])
|
||||
of tkCurlyDotRi:
|
||||
dispA(result, "</div><span class=\"Other pragmaend\">$1</span>",
|
||||
"\\spanOther{$1}",
|
||||
[rope(esc(d.target, literal))])
|
||||
of tkParLe, tkParRi, tkBracketLe, tkBracketRi, tkCurlyLe, tkCurlyRi,
|
||||
tkBracketDotLe, tkBracketDotRi, tkParDotLe,
|
||||
tkParDotRi, tkComma, tkSemiColon, tkColon, tkEquals, tkDot, tkDotDot,
|
||||
tkAccent, tkColonColon,
|
||||
tkGStrLit, tkGTripleStrLit, tkInfixOpr, tkPrefixOpr, tkPostfixOpr:
|
||||
dispA(result, "<span class=\"Other\">$1</span>", "\\spanOther{$1}",
|
||||
[rope(esc(d.target, literal))])
|
||||
nodeToHighlightedHtml(d, n, result, {renderNoBody, renderNoComments, renderDocComments})
|
||||
|
||||
inc(d.id)
|
||||
let
|
||||
@@ -520,12 +553,24 @@ proc genJsonItem(d: PDoc, n, nameNode: PNode, k: TSymKind): JsonNode =
|
||||
proc checkForFalse(n: PNode): bool =
|
||||
result = n.kind == nkIdent and cmpIgnoreStyle(n.ident.s, "false") == 0
|
||||
|
||||
proc traceDeps(d: PDoc, n: PNode) =
|
||||
proc traceDeps(d: PDoc, it: PNode) =
|
||||
const k = skModule
|
||||
if d.section[k] != nil: add(d.section[k], ", ")
|
||||
dispA(d.section[k],
|
||||
"<a class=\"reference external\" href=\"$1.html\">$1</a>",
|
||||
"$1", [rope(getModuleName(n))])
|
||||
|
||||
if it.kind == nkInfix and it.len == 3 and it[2].kind == nkBracket:
|
||||
let sep = it[0]
|
||||
let dir = it[1]
|
||||
let a = newNodeI(nkInfix, it.info)
|
||||
a.add sep
|
||||
a.add dir
|
||||
a.add sep # dummy entry, replaced in the loop
|
||||
for x in it[2]:
|
||||
a.sons[2] = x
|
||||
traceDeps(d, a)
|
||||
else:
|
||||
if d.section[k] != nil: add(d.section[k], ", ")
|
||||
dispA(d.section[k],
|
||||
"<a class=\"reference external\" href=\"$1.html\">$1</a>",
|
||||
"$1", [rope(getModuleName(it))])
|
||||
|
||||
proc generateDoc*(d: PDoc, n: PNode) =
|
||||
case n.kind
|
||||
@@ -608,6 +653,49 @@ proc generateJson*(d: PDoc, n: PNode) =
|
||||
generateJson(d, lastSon(n.sons[0]))
|
||||
else: discard
|
||||
|
||||
proc genTagsItem(d: PDoc, n, nameNode: PNode, k: TSymKind): string =
|
||||
result = getName(d, nameNode) & "\n"
|
||||
|
||||
proc generateTags*(d: PDoc, n: PNode, r: var Rope) =
|
||||
case n.kind
|
||||
of nkCommentStmt:
|
||||
if n.comment != nil and startsWith(n.comment, "##"):
|
||||
let stripped = n.comment.substr(2).strip
|
||||
r.add stripped
|
||||
of nkProcDef:
|
||||
when useEffectSystem: documentRaises(n)
|
||||
r.add genTagsItem(d, n, n.sons[namePos], skProc)
|
||||
of nkFuncDef:
|
||||
when useEffectSystem: documentRaises(n)
|
||||
r.add genTagsItem(d, n, n.sons[namePos], skFunc)
|
||||
of nkMethodDef:
|
||||
when useEffectSystem: documentRaises(n)
|
||||
r.add genTagsItem(d, n, n.sons[namePos], skMethod)
|
||||
of nkIteratorDef:
|
||||
when useEffectSystem: documentRaises(n)
|
||||
r.add genTagsItem(d, n, n.sons[namePos], skIterator)
|
||||
of nkMacroDef:
|
||||
r.add genTagsItem(d, n, n.sons[namePos], skMacro)
|
||||
of nkTemplateDef:
|
||||
r.add genTagsItem(d, n, n.sons[namePos], skTemplate)
|
||||
of nkConverterDef:
|
||||
when useEffectSystem: documentRaises(n)
|
||||
r.add genTagsItem(d, n, n.sons[namePos], skConverter)
|
||||
of nkTypeSection, nkVarSection, nkLetSection, nkConstSection:
|
||||
for i in countup(0, sonsLen(n) - 1):
|
||||
if n.sons[i].kind != nkCommentStmt:
|
||||
# order is always 'type var let const':
|
||||
r.add genTagsItem(d, n.sons[i], n.sons[i].sons[0],
|
||||
succ(skType, ord(n.kind)-ord(nkTypeSection)))
|
||||
of nkStmtList:
|
||||
for i in countup(0, sonsLen(n) - 1):
|
||||
generateTags(d, n.sons[i], r)
|
||||
of nkWhenStmt:
|
||||
# generate documentation for the first branch only:
|
||||
if not checkForFalse(n.sons[0].sons[0]):
|
||||
generateTags(d, lastSon(n.sons[0]), r)
|
||||
else: discard
|
||||
|
||||
proc genSection(d: PDoc, kind: TSymKind) =
|
||||
const sectionNames: array[skModule..skTemplate, string] = [
|
||||
"Imports", "Types", "Vars", "Lets", "Consts", "Vars", "Procs", "Funcs",
|
||||
@@ -712,6 +800,26 @@ proc commandDoc*() =
|
||||
proc commandRstAux(filename, outExt: string) =
|
||||
var filen = addFileExt(filename, "txt")
|
||||
var d = newDocumentor(filen, options.gConfigVars)
|
||||
d.onTestSnippet = proc (d: var RstGenerator; filename, cmd: string;
|
||||
status: int; content: string) =
|
||||
var outp: string
|
||||
if filename.len == 0:
|
||||
inc(d.id)
|
||||
let nameOnly = splitFile(d.filename).name
|
||||
let subdir = getNimcacheDir() / nameOnly
|
||||
createDir(subdir)
|
||||
outp = subdir / (nameOnly & "_snippet_" & $d.id & ".nim")
|
||||
elif isAbsolute(filename):
|
||||
outp = filename
|
||||
else:
|
||||
# Nim's convention: every path is relative to the file it was written in:
|
||||
outp = splitFile(d.filename).dir / filename
|
||||
writeFile(outp, content)
|
||||
let cmd = unescape(cmd) % quoteShell(outp)
|
||||
rawMessage(hintExecuting, cmd)
|
||||
if execShellCmd(cmd) != status:
|
||||
rawMessage(errExecutionOfProgramFailed, cmd)
|
||||
|
||||
d.isPureRst = true
|
||||
var rst = parseRst(readFile(filen), filen, 0, 1, d.hasToc,
|
||||
{roSupportRawDirective})
|
||||
@@ -745,6 +853,21 @@ proc commandJson*() =
|
||||
#echo getOutFile(gProjectFull, JsonExt)
|
||||
writeRope(content, getOutFile(gProjectFull, JsonExt), useWarning = false)
|
||||
|
||||
proc commandTags*() =
|
||||
var ast = parseFile(gProjectMainIdx, newIdentCache())
|
||||
if ast == nil: return
|
||||
var d = newDocumentor(gProjectFull, options.gConfigVars)
|
||||
d.hasToc = true
|
||||
var
|
||||
content: Rope
|
||||
generateTags(d, ast, content)
|
||||
|
||||
if optStdout in gGlobalOptions:
|
||||
writeRope(stdout, content)
|
||||
else:
|
||||
#echo getOutFile(gProjectFull, TagsExt)
|
||||
writeRope(content, getOutFile(gProjectFull, TagsExt), useWarning = false)
|
||||
|
||||
proc commandBuildIndex*() =
|
||||
var content = mergeIndexes(gProjectFull).rope
|
||||
|
||||
|
||||
@@ -225,7 +225,7 @@ proc pack(v: PNode, typ: PType, res: pointer) =
|
||||
awr(pointer, res +! sizeof(pointer))
|
||||
of tyArray:
|
||||
let baseSize = typ.sons[1].getSize
|
||||
for i in 0 .. <v.len:
|
||||
for i in 0 ..< v.len:
|
||||
pack(v.sons[i], typ.sons[1], res +! i * baseSize)
|
||||
of tyObject, tyTuple:
|
||||
packObject(v, typ, res)
|
||||
@@ -291,7 +291,7 @@ proc unpackArray(x: pointer, typ: PType, n: PNode): PNode =
|
||||
if result.kind != nkBracket:
|
||||
globalError(n.info, "cannot map value from FFI")
|
||||
let baseSize = typ.sons[1].getSize
|
||||
for i in 0 .. < result.len:
|
||||
for i in 0 ..< result.len:
|
||||
result.sons[i] = unpack(x +! i * baseSize, typ.sons[1], result.sons[i])
|
||||
|
||||
proc canonNodeKind(k: TNodeKind): TNodeKind =
|
||||
|
||||
@@ -42,7 +42,7 @@ proc evalTemplateAux(templ, actual: PNode, c: var TemplCtx, result: PNode) =
|
||||
s.kind == skType and s.typ != nil and s.typ.kind == tyGenericParam:
|
||||
handleParam actual.sons[s.owner.typ.len + s.position - 1]
|
||||
else:
|
||||
internalAssert sfGenSym in s.flags
|
||||
internalAssert sfGenSym in s.flags or s.kind == skType
|
||||
var x = PSym(idTableGet(c.mapping, s))
|
||||
if x == nil:
|
||||
x = copySym(s, false)
|
||||
@@ -77,7 +77,7 @@ proc evalTemplateArgs(n: PNode, s: PSym; fromHlo: bool): PNode =
|
||||
# now that we have working untyped parameters.
|
||||
genericParams = if sfImmediate in s.flags or fromHlo: 0
|
||||
else: s.ast[genericParamsPos].len
|
||||
expectedRegularParams = <s.typ.len
|
||||
expectedRegularParams = s.typ.len-1
|
||||
givenRegularParams = totalParams - genericParams
|
||||
if givenRegularParams < 0: givenRegularParams = 0
|
||||
|
||||
@@ -109,7 +109,7 @@ proc evalTemplateArgs(n: PNode, s: PSym; fromHlo: bool): PNode =
|
||||
var evalTemplateCounter* = 0
|
||||
# to prevent endless recursion in templates instantiation
|
||||
|
||||
proc wrapInComesFrom*(info: TLineInfo; res: PNode): PNode =
|
||||
proc wrapInComesFrom*(info: TLineInfo; sym: PSym; res: PNode): PNode =
|
||||
when true:
|
||||
result = res
|
||||
result.info = info
|
||||
@@ -124,8 +124,12 @@ proc wrapInComesFrom*(info: TLineInfo; res: PNode): PNode =
|
||||
if x[i].kind in nkCallKinds:
|
||||
x.sons[i].info = info
|
||||
else:
|
||||
result = newNodeI(nkPar, info)
|
||||
result = newNodeI(nkStmtListExpr, info)
|
||||
var d = newNodeI(nkComesFrom, info)
|
||||
d.add newSymNode(sym, info)
|
||||
result.add d
|
||||
result.add res
|
||||
result.typ = res.typ
|
||||
|
||||
proc evalTemplate*(n: PNode, tmpl, genSymOwner: PSym; fromHlo=false): PNode =
|
||||
inc(evalTemplateCounter)
|
||||
@@ -156,6 +160,6 @@ proc evalTemplate*(n: PNode, tmpl, genSymOwner: PSym; fromHlo=false): PNode =
|
||||
for i in countup(0, safeLen(body) - 1):
|
||||
evalTemplateAux(body.sons[i], args, ctx, result)
|
||||
result.flags.incl nfFromTemplate
|
||||
result = wrapInComesFrom(n.info, result)
|
||||
result = wrapInComesFrom(n.info, tmpl, result)
|
||||
dec(evalTemplateCounter)
|
||||
|
||||
|
||||
@@ -21,7 +21,7 @@ import
|
||||
type
|
||||
TSystemCC* = enum
|
||||
ccNone, ccGcc, ccLLVM_Gcc, ccCLang, ccLcc, ccBcc, ccDmc, ccWcc, ccVcc,
|
||||
ccTcc, ccPcc, ccUcc, ccIcl
|
||||
ccTcc, ccPcc, ccUcc, ccIcl, ccIcc
|
||||
TInfoCCProp* = enum # properties of the C compiler:
|
||||
hasSwitchRange, # CC allows ranges in switch statements (GNU C)
|
||||
hasComputedGoto, # CC has computed goto (GNU C extension)
|
||||
@@ -95,7 +95,11 @@ compiler llvmGcc:
|
||||
result.name = "llvm_gcc"
|
||||
result.compilerExe = "llvm-gcc"
|
||||
result.cppCompiler = "llvm-g++"
|
||||
result.buildLib = "llvm-ar rcs $libfile $objfiles"
|
||||
when defined(macosx):
|
||||
# OS X has no 'llvm-ar' tool:
|
||||
result.buildLib = "ar rcs $libfile $objfiles"
|
||||
else:
|
||||
result.buildLib = "llvm-ar rcs $libfile $objfiles"
|
||||
|
||||
# Clang (LLVM) C/C++ Compiler
|
||||
compiler clang:
|
||||
@@ -131,16 +135,18 @@ compiler vcc:
|
||||
|
||||
# Intel C/C++ Compiler
|
||||
compiler icl:
|
||||
# Intel compilers try to imitate the native ones (gcc and msvc)
|
||||
when defined(windows):
|
||||
result = vcc()
|
||||
else:
|
||||
result = gcc()
|
||||
|
||||
result = vcc()
|
||||
result.name = "icl"
|
||||
result.compilerExe = "icl"
|
||||
result.linkerExe = "icl"
|
||||
|
||||
# Intel compilers try to imitate the native ones (gcc and msvc)
|
||||
compiler icc:
|
||||
result = gcc()
|
||||
result.name = "icc"
|
||||
result.compilerExe = "icc"
|
||||
result.linkerExe = "icc"
|
||||
|
||||
# Local C Compiler
|
||||
compiler lcc:
|
||||
result = (
|
||||
@@ -247,7 +253,7 @@ compiler tcc:
|
||||
compilerExe: "tcc",
|
||||
cppCompiler: "",
|
||||
compileTmpl: "-c $options $include -o $objfile $file",
|
||||
buildGui: "UNAVAILABLE!",
|
||||
buildGui: "-Wl,-subsystem=gui",
|
||||
buildDll: " -shared",
|
||||
buildLib: "", # XXX: not supported yet
|
||||
linkerExe: "tcc",
|
||||
@@ -323,7 +329,8 @@ const
|
||||
tcc(),
|
||||
pcc(),
|
||||
ucc(),
|
||||
icl()]
|
||||
icl(),
|
||||
icc()]
|
||||
|
||||
hExt* = ".h"
|
||||
|
||||
@@ -724,13 +731,13 @@ proc execCmdsInParallel(cmds: seq[string]; prettyCb: proc (idx: int)) =
|
||||
else:
|
||||
tryExceptOSErrorMessage("invocation of external compiler program failed."):
|
||||
if optListCmd in gGlobalOptions or gVerbosity > 1:
|
||||
res = execProcesses(cmds, {poEchoCmd, poStdErrToStdOut, poUsePath, poParentStreams},
|
||||
res = execProcesses(cmds, {poEchoCmd, poStdErrToStdOut, poUsePath},
|
||||
gNumberOfProcessors, afterRunEvent=runCb)
|
||||
elif gVerbosity == 1:
|
||||
res = execProcesses(cmds, {poStdErrToStdOut, poUsePath, poParentStreams},
|
||||
res = execProcesses(cmds, {poStdErrToStdOut, poUsePath},
|
||||
gNumberOfProcessors, prettyCb, afterRunEvent=runCb)
|
||||
else:
|
||||
res = execProcesses(cmds, {poStdErrToStdOut, poUsePath, poParentStreams},
|
||||
res = execProcesses(cmds, {poStdErrToStdOut, poUsePath},
|
||||
gNumberOfProcessors, afterRunEvent=runCb)
|
||||
if res != 0:
|
||||
if gNumberOfProcessors <= 1:
|
||||
@@ -760,8 +767,9 @@ proc callCCompiler*(projectfile: string) =
|
||||
add(objfiles, quoteShell(
|
||||
addFileExt(objFile, CC[cCompiler].objExt)))
|
||||
for x in toCompile:
|
||||
let objFile = if noAbsolutePaths(): x.obj.extractFilename else: x.obj
|
||||
add(objfiles, ' ')
|
||||
add(objfiles, quoteShell(x.obj))
|
||||
add(objfiles, quoteShell(objFile))
|
||||
|
||||
linkCmd = getLinkCmd(projectfile, objfiles)
|
||||
if optCompileOnly notin gGlobalOptions:
|
||||
@@ -786,42 +794,40 @@ proc writeJsonBuildInstructions*(projectfile: string) =
|
||||
else:
|
||||
f.write escapeJson(x)
|
||||
|
||||
proc cfiles(f: File; buf: var string; list: CfileList, isExternal: bool) =
|
||||
var i = 0
|
||||
for it in list:
|
||||
proc cfiles(f: File; buf: var string; clist: CfileList, isExternal: bool) =
|
||||
var pastStart = false
|
||||
for it in clist:
|
||||
if CfileFlag.Cached in it.flags: continue
|
||||
let compileCmd = getCompileCFileCmd(it)
|
||||
if pastStart: lit "],\L"
|
||||
lit "["
|
||||
str it.cname
|
||||
lit ", "
|
||||
str compileCmd
|
||||
inc i
|
||||
if i == list.len:
|
||||
lit "]\L"
|
||||
else:
|
||||
lit "],\L"
|
||||
pastStart = true
|
||||
lit "]\L"
|
||||
|
||||
proc linkfiles(f: File; buf, objfiles: var string) =
|
||||
for i, it in externalToLink:
|
||||
let
|
||||
objFile = if noAbsolutePaths(): it.extractFilename else: it
|
||||
objStr = addFileExt(objFile, CC[cCompiler].objExt)
|
||||
proc linkfiles(f: File; buf, objfiles: var string; clist: CfileList;
|
||||
llist: seq[string]) =
|
||||
var pastStart = false
|
||||
for it in llist:
|
||||
let objfile = if noAbsolutePaths(): it.extractFilename
|
||||
else: it
|
||||
let objstr = addFileExt(objfile, CC[cCompiler].objExt)
|
||||
add(objfiles, ' ')
|
||||
add(objfiles, objStr)
|
||||
str objStr
|
||||
if toCompile.len == 0 and i == externalToLink.high:
|
||||
lit "\L"
|
||||
else:
|
||||
lit ",\L"
|
||||
for i, x in toCompile:
|
||||
let objStr = quoteShell(x.obj)
|
||||
add(objfiles, objstr)
|
||||
if pastStart: lit ",\L"
|
||||
str objstr
|
||||
pastStart = true
|
||||
|
||||
for it in clist:
|
||||
let objstr = quoteShell(it.obj)
|
||||
add(objfiles, ' ')
|
||||
add(objfiles, objStr)
|
||||
str objStr
|
||||
if i == toCompile.high:
|
||||
lit "\L"
|
||||
else:
|
||||
lit ",\L"
|
||||
add(objfiles, objstr)
|
||||
if pastStart: lit ",\L"
|
||||
str objstr
|
||||
pastStart = true
|
||||
lit "\L"
|
||||
|
||||
var buf = newStringOfCap(50)
|
||||
|
||||
@@ -835,7 +841,7 @@ proc writeJsonBuildInstructions*(projectfile: string) =
|
||||
lit "],\L\"link\":[\L"
|
||||
var objfiles = ""
|
||||
# XXX add every file here that is to link
|
||||
linkfiles(f, buf, objfiles)
|
||||
linkfiles(f, buf, objfiles, toCompile, externalToLink)
|
||||
|
||||
lit "],\L\"linkcmd\": "
|
||||
str getLinkCmd(projectfile, objfiles)
|
||||
|
||||
@@ -45,13 +45,13 @@ proc counterInTree(n, loop: PNode; counter: PSym): bool =
|
||||
for it in n:
|
||||
if counterInTree(it.lastSon): return true
|
||||
else:
|
||||
for i in 0 .. <safeLen(n):
|
||||
for i in 0 ..< safeLen(n):
|
||||
if counterInTree(n[i], loop, counter): return true
|
||||
|
||||
proc copyExcept(n: PNode, x, dest: PNode) =
|
||||
if x == n: return
|
||||
if n.kind in {nkStmtList, nkStmtListExpr}:
|
||||
for i in 0 .. <n.len: copyExcept(n[i], x, dest)
|
||||
for i in 0 ..< n.len: copyExcept(n[i], x, dest)
|
||||
else:
|
||||
dest.add n
|
||||
|
||||
|
||||
@@ -7,8 +7,7 @@
|
||||
# distribution, for details about the copyright.
|
||||
#
|
||||
|
||||
## Module that implements ``gorge`` for the compiler as well as
|
||||
## the scriptable import mechanism.
|
||||
## Module that implements ``gorge`` for the compiler.
|
||||
|
||||
import msgs, securehash, os, osproc, streams, strutils, options
|
||||
|
||||
@@ -56,28 +55,3 @@ proc opGorge*(cmd, input, cache: string, info: TLineInfo): (string, int) =
|
||||
result = p.readOutput
|
||||
except IOError, OSError:
|
||||
result = ("", -1)
|
||||
|
||||
proc scriptableImport*(pkg, subdir: string; info: TLineInfo): string =
|
||||
var cmd = getConfigVar("resolver.exe")
|
||||
if cmd.len == 0: cmd = "nimresolve"
|
||||
else: cmd = quoteShell(cmd)
|
||||
cmd.add " --source:"
|
||||
cmd.add quoteShell(info.toFullPath())
|
||||
cmd.add " --stdlib:"
|
||||
cmd.add quoteShell(options.libpath)
|
||||
cmd.add " --project:"
|
||||
cmd.add quoteShell(gProjectFull)
|
||||
if subdir.len != 0:
|
||||
cmd.add " --subdir:"
|
||||
cmd.add quoteShell(subdir)
|
||||
if options.gNoNimblePath:
|
||||
cmd.add " --nonimblepath"
|
||||
cmd.add ' '
|
||||
cmd.add quoteShell(pkg)
|
||||
let (res, exitCode) = opGorge(cmd, "", cmd, info)
|
||||
if exitCode == 0:
|
||||
result = res.strip()
|
||||
elif res.len > 0:
|
||||
localError(info, res)
|
||||
else:
|
||||
localError(info, "cannot resolve: " & (pkg / subdir))
|
||||
|
||||
@@ -52,7 +52,7 @@ proc isLet(n: PNode): bool =
|
||||
|
||||
proc isVar(n: PNode): bool =
|
||||
n.kind == nkSym and n.sym.kind in {skResult, skVar} and
|
||||
{sfGlobal, sfAddrTaken} * n.sym.flags == {}
|
||||
{sfAddrTaken} * n.sym.flags == {}
|
||||
|
||||
proc isLetLocation(m: PNode, isApprox: bool): bool =
|
||||
# consider: 'n[].kind' --> we really need to support 1 deref op even if this
|
||||
@@ -247,7 +247,7 @@ proc canon*(n: PNode): PNode =
|
||||
# XXX for now only the new code in 'semparallel' uses this
|
||||
if n.safeLen >= 1:
|
||||
result = shallowCopy(n)
|
||||
for i in 0 .. < n.len:
|
||||
for i in 0 ..< n.len:
|
||||
result.sons[i] = canon(n.sons[i])
|
||||
elif n.kind == nkSym and n.sym.kind == skLet and
|
||||
n.sym.ast.getMagic in (someEq + someAdd + someMul + someMin +
|
||||
@@ -768,8 +768,10 @@ macro `=~`(x: PNode, pat: untyped): bool =
|
||||
|
||||
var conds = newTree(nnkBracket)
|
||||
m(x, pat, conds)
|
||||
result = nestList(!"and", conds)
|
||||
|
||||
when declared(macros.toNimIdent):
|
||||
result = nestList(toNimIdent"and", conds)
|
||||
else:
|
||||
result = nestList(!"and", conds)
|
||||
|
||||
proc isMinusOne(n: PNode): bool =
|
||||
n.kind in {nkCharLit..nkUInt64Lit} and n.intVal == -1
|
||||
|
||||
@@ -36,7 +36,7 @@ proc applyPatterns(c: PContext, n: PNode): PNode =
|
||||
# we apply the last pattern first, so that pattern overriding is possible;
|
||||
# however the resulting AST would better not trigger the old rule then
|
||||
# anymore ;-)
|
||||
for i in countdown(<c.patterns.len, 0):
|
||||
for i in countdown(c.patterns.len-1, 0):
|
||||
let pattern = c.patterns[i]
|
||||
if not isNil(pattern):
|
||||
let x = applyRule(c, pattern, result)
|
||||
@@ -75,7 +75,7 @@ proc hlo(c: PContext, n: PNode): PNode =
|
||||
result = applyPatterns(c, n)
|
||||
if result == n:
|
||||
# no optimization applied, try subtrees:
|
||||
for i in 0 .. < safeLen(result):
|
||||
for i in 0 ..< safeLen(result):
|
||||
let a = result.sons[i]
|
||||
let h = hlo(c, a)
|
||||
if h != a: result.sons[i] = h
|
||||
|
||||
@@ -11,83 +11,11 @@
|
||||
|
||||
import
|
||||
intsets, strutils, os, ast, astalgo, msgs, options, idents, rodread, lookups,
|
||||
semdata, passes, renderer, gorgeimpl
|
||||
semdata, passes, renderer, modulepaths
|
||||
|
||||
proc evalImport*(c: PContext, n: PNode): PNode
|
||||
proc evalFrom*(c: PContext, n: PNode): PNode
|
||||
|
||||
proc lookupPackage(pkg, subdir: PNode): string =
|
||||
let sub = if subdir != nil: renderTree(subdir, {renderNoComments}).replace(" ") else: ""
|
||||
case pkg.kind
|
||||
of nkStrLit, nkRStrLit, nkTripleStrLit:
|
||||
result = scriptableImport(pkg.strVal, sub, pkg.info)
|
||||
of nkIdent:
|
||||
result = scriptableImport(pkg.ident.s, sub, pkg.info)
|
||||
else:
|
||||
localError(pkg.info, "package name must be an identifier or string literal")
|
||||
result = ""
|
||||
|
||||
proc getModuleName*(n: PNode): string =
|
||||
# This returns a short relative module name without the nim extension
|
||||
# e.g. like "system", "importer" or "somepath/module"
|
||||
# The proc won't perform any checks that the path is actually valid
|
||||
case n.kind
|
||||
of nkStrLit, nkRStrLit, nkTripleStrLit:
|
||||
try:
|
||||
result = pathSubs(n.strVal, n.info.toFullPath().splitFile().dir)
|
||||
except ValueError:
|
||||
localError(n.info, "invalid path: " & n.strVal)
|
||||
result = n.strVal
|
||||
of nkIdent:
|
||||
result = n.ident.s
|
||||
of nkSym:
|
||||
result = n.sym.name.s
|
||||
of nkInfix:
|
||||
let n0 = n[0]
|
||||
let n1 = n[1]
|
||||
if n0.kind == nkIdent and n0.ident.id == getIdent("as").id:
|
||||
# XXX hack ahead:
|
||||
n.kind = nkImportAs
|
||||
n.sons[0] = n.sons[1]
|
||||
n.sons[1] = n.sons[2]
|
||||
n.sons.setLen(2)
|
||||
return getModuleName(n.sons[0])
|
||||
if n1.kind == nkPrefix and n1[0].kind == nkIdent and n1[0].ident.s == "$":
|
||||
if n0.kind == nkIdent and n0.ident.s == "/":
|
||||
result = lookupPackage(n1[1], n[2])
|
||||
else:
|
||||
localError(n.info, "only '/' supported with $package notation")
|
||||
result = ""
|
||||
else:
|
||||
# hacky way to implement 'x / y /../ z':
|
||||
result = getModuleName(n1)
|
||||
result.add renderTree(n0, {renderNoComments})
|
||||
result.add getModuleName(n[2])
|
||||
of nkPrefix:
|
||||
if n.sons[0].kind == nkIdent and n.sons[0].ident.s == "$":
|
||||
result = lookupPackage(n[1], nil)
|
||||
else:
|
||||
# hacky way to implement 'x / y /../ z':
|
||||
result = renderTree(n, {renderNoComments}).replace(" ")
|
||||
of nkDotExpr:
|
||||
result = renderTree(n, {renderNoComments}).replace(".", "/")
|
||||
of nkImportAs:
|
||||
result = getModuleName(n.sons[0])
|
||||
else:
|
||||
localError(n.info, errGenerated, "invalid module name: '$1'" % n.renderTree)
|
||||
result = ""
|
||||
|
||||
proc checkModuleName*(n: PNode; doLocalError=true): int32 =
|
||||
# This returns the full canonical path for a given module import
|
||||
let modulename = n.getModuleName
|
||||
let fullPath = findModule(modulename, n.info.toFullPath)
|
||||
if fullPath.len == 0:
|
||||
if doLocalError:
|
||||
localError(n.info, errCannotOpenFile, modulename)
|
||||
result = InvalidFileIDX
|
||||
else:
|
||||
result = fullPath.fileInfoIdx
|
||||
|
||||
proc importPureEnumField*(c: PContext; s: PSym) =
|
||||
var check = strTableGet(c.importTable.symbols, s.name)
|
||||
if check == nil:
|
||||
@@ -99,7 +27,7 @@ proc rawImportSymbol(c: PContext, s: PSym) =
|
||||
# check if we have already a symbol of the same name:
|
||||
var check = strTableGet(c.importTable.symbols, s.name)
|
||||
if check != nil and check.id != s.id:
|
||||
if s.kind notin OverloadableSyms:
|
||||
if s.kind notin OverloadableSyms or check.kind notin OverloadableSyms:
|
||||
# s and check need to be qualified:
|
||||
incl(c.ambiguousSymbols, s.id)
|
||||
incl(c.ambiguousSymbols, check.id)
|
||||
|
||||
@@ -191,11 +191,12 @@ proc mapType(typ: PType): TJSTypeKind =
|
||||
of tyObject, tyArray, tyTuple, tyOpenArray, tyVarargs:
|
||||
result = etyObject
|
||||
of tyNil: result = etyNull
|
||||
of tyGenericInst, tyGenericParam, tyGenericBody, tyGenericInvocation,
|
||||
of tyGenericParam, tyGenericBody, tyGenericInvocation,
|
||||
tyNone, tyFromExpr, tyForward, tyEmpty,
|
||||
tyExpr, tyStmt, tyTypeDesc, tyTypeClasses, tyVoid, tyAlias:
|
||||
tyExpr, tyStmt, tyTypeDesc, tyBuiltInTypeClass, tyCompositeTypeClass,
|
||||
tyAnd, tyOr, tyNot, tyAnything, tyVoid:
|
||||
result = etyNone
|
||||
of tyInferred:
|
||||
of tyGenericInst, tyInferred, tyAlias, tyUserTypeClass, tyUserTypeClassInst:
|
||||
result = mapType(typ.lastSon)
|
||||
of tyStatic:
|
||||
if t.n != nil: result = mapType(lastSon t)
|
||||
@@ -376,8 +377,8 @@ const # magic checked op; magic unchecked op; checked op; unchecked op
|
||||
["addInt", "", "addInt($1, $2)", "($1 + $2)"], # AddI
|
||||
["subInt", "", "subInt($1, $2)", "($1 - $2)"], # SubI
|
||||
["mulInt", "", "mulInt($1, $2)", "($1 * $2)"], # MulI
|
||||
["divInt", "", "divInt($1, $2)", "Math.floor($1 / $2)"], # DivI
|
||||
["modInt", "", "modInt($1, $2)", "Math.floor($1 % $2)"], # ModI
|
||||
["divInt", "", "divInt($1, $2)", "Math.trunc($1 / $2)"], # DivI
|
||||
["modInt", "", "modInt($1, $2)", "Math.trunc($1 % $2)"], # ModI
|
||||
["addInt", "", "addInt($1, $2)", "($1 + $2)"], # Succ
|
||||
["subInt", "", "subInt($1, $2)", "($1 - $2)"], # Pred
|
||||
["", "", "($1 + $2)", "($1 + $2)"], # AddF64
|
||||
@@ -444,8 +445,8 @@ const # magic checked op; magic unchecked op; checked op; unchecked op
|
||||
["toU32", "toU32", "toU32($1)", "toU32($1)"], # toU32
|
||||
["", "", "$1", "$1"], # ToFloat
|
||||
["", "", "$1", "$1"], # ToBiggestFloat
|
||||
["", "", "Math.floor($1)", "Math.floor($1)"], # ToInt
|
||||
["", "", "Math.floor($1)", "Math.floor($1)"], # ToBiggestInt
|
||||
["", "", "Math.trunc($1)", "Math.trunc($1)"], # ToInt
|
||||
["", "", "Math.trunc($1)", "Math.trunc($1)"], # ToBiggestInt
|
||||
["nimCharToStr", "nimCharToStr", "nimCharToStr($1)", "nimCharToStr($1)"],
|
||||
["nimBoolToStr", "nimBoolToStr", "nimBoolToStr($1)", "nimBoolToStr($1)"],
|
||||
["cstrToNimstr", "cstrToNimstr", "cstrToNimstr(($1)+\"\")", "cstrToNimstr(($1)+\"\")"],
|
||||
@@ -1067,7 +1068,7 @@ proc genArrayAddr(p: PProc, n: PNode, r: var TCompRes) =
|
||||
else:
|
||||
r.res = "chckIndx($1, $2, strlen($3))-$2" % [b.res, rope(first), a.res]
|
||||
else:
|
||||
r.res = "chckIndx($1, $2, $3.length-1)-$2" % [b.res, rope(first), a.res]
|
||||
r.res = "chckIndx($1, $2, $3.length+$2-1)-$2" % [b.res, rope(first), a.res]
|
||||
elif first != 0:
|
||||
r.res = "($1)-$2" % [b.res, rope(first)]
|
||||
else:
|
||||
@@ -1363,7 +1364,7 @@ proc genPatternCall(p: PProc; n: PNode; pat: string; typ: PType;
|
||||
case pat[i]
|
||||
of '@':
|
||||
var generated = 0
|
||||
for k in j .. < n.len:
|
||||
for k in j ..< n.len:
|
||||
if generated > 0: add(r.res, ", ")
|
||||
genOtherArg(p, n, k, typ, generated, r)
|
||||
inc i
|
||||
@@ -1528,7 +1529,7 @@ proc createVar(p: PProc, typ: PType, indirect: bool): Rope =
|
||||
of tyTuple:
|
||||
if p.target == targetJS:
|
||||
result = rope("{")
|
||||
for i in 0.. <t.sonsLen:
|
||||
for i in 0..<t.sonsLen:
|
||||
if i > 0: add(result, ", ")
|
||||
addf(result, "Field$1: $2", [i.rope,
|
||||
createVar(p, t.sons[i], false)])
|
||||
@@ -1536,7 +1537,7 @@ proc createVar(p: PProc, typ: PType, indirect: bool): Rope =
|
||||
if indirect: result = "[$1]" % [result]
|
||||
else:
|
||||
result = rope("array(")
|
||||
for i in 0.. <t.sonsLen:
|
||||
for i in 0..<t.sonsLen:
|
||||
if i > 0: add(result, ", ")
|
||||
add(result, createVar(p, t.sons[i], false))
|
||||
add(result, ")")
|
||||
@@ -1562,14 +1563,22 @@ proc createVar(p: PProc, typ: PType, indirect: bool): Rope =
|
||||
internalError("createVar: " & $t.kind)
|
||||
result = nil
|
||||
|
||||
template returnType: untyped =
|
||||
~""
|
||||
|
||||
proc genVarInit(p: PProc, v: PSym, n: PNode) =
|
||||
var
|
||||
a: TCompRes
|
||||
s: Rope
|
||||
varCode: string
|
||||
if v.constraint.isNil:
|
||||
varCode = "var $2"
|
||||
else:
|
||||
varCode = v.constraint.strVal
|
||||
if n.kind == nkEmpty:
|
||||
let mname = mangleName(v, p.target)
|
||||
lineF(p, "var $1 = $2;$n" | "$$$1 = $2;$n",
|
||||
[mname, createVar(p, v.typ, isIndirect(v))])
|
||||
lineF(p, varCode & " = $3;$n" | "$$$2 = $3;$n",
|
||||
[returnType, mname, createVar(p, v.typ, isIndirect(v))])
|
||||
if v.typ.kind in { tyVar, tyPtr, tyRef } and mapType(p, v.typ) == etyBaseIndex:
|
||||
lineF(p, "var $1_Idx = 0;$n", [ mname ])
|
||||
else:
|
||||
@@ -1586,25 +1595,25 @@ proc genVarInit(p: PProc, v: PSym, n: PNode) =
|
||||
let targetBaseIndex = {sfAddrTaken, sfGlobal} * v.flags == {}
|
||||
if a.typ == etyBaseIndex:
|
||||
if targetBaseIndex:
|
||||
lineF(p, "var $1 = $2, $1_Idx = $3;$n",
|
||||
[v.loc.r, a.address, a.res])
|
||||
lineF(p, varCode & " = $3, $2_Idx = $4;$n",
|
||||
[returnType, v.loc.r, a.address, a.res])
|
||||
else:
|
||||
lineF(p, "var $1 = [$2, $3];$n",
|
||||
[v.loc.r, a.address, a.res])
|
||||
lineF(p, varCode & " = [$3, $4];$n",
|
||||
[returnType, v.loc.r, a.address, a.res])
|
||||
else:
|
||||
if targetBaseIndex:
|
||||
let tmp = p.getTemp
|
||||
lineF(p, "var $1 = $2, $3 = $1[0], $3_Idx = $1[1];$n",
|
||||
[tmp, a.res, v.loc.r])
|
||||
else:
|
||||
lineF(p, "var $1 = $2;$n", [v.loc.r, a.res])
|
||||
lineF(p, varCode & " = $3;$n", [returnType, v.loc.r, a.res])
|
||||
return
|
||||
else:
|
||||
s = a.res
|
||||
if isIndirect(v):
|
||||
lineF(p, "var $1 = [$2];$n", [v.loc.r, s])
|
||||
lineF(p, varCode & " = [$3];$n", [returnType, v.loc.r, s])
|
||||
else:
|
||||
lineF(p, "var $1 = $2;$n" | "$$$1 = $2;$n", [v.loc.r, s])
|
||||
lineF(p, varCode & " = $3;$n" | "$$$2 = $3;$n", [returnType, v.loc.r, s])
|
||||
|
||||
proc genVarStmt(p: PProc, n: PNode) =
|
||||
for i in countup(0, sonsLen(n) - 1):
|
||||
@@ -1650,7 +1659,7 @@ proc genNewSeq(p: PProc, n: PNode) =
|
||||
|
||||
proc genOrd(p: PProc, n: PNode, r: var TCompRes) =
|
||||
case skipTypes(n.sons[1].typ, abstractVar).kind
|
||||
of tyEnum, tyInt..tyInt64, tyChar: gen(p, n.sons[1], r)
|
||||
of tyEnum, tyInt..tyUInt64, tyChar: gen(p, n.sons[1], r)
|
||||
of tyBool: unaryExpr(p, n, r, "", "($1 ? 1:0)")
|
||||
else: internalError(n.info, "genOrd")
|
||||
|
||||
@@ -2042,10 +2051,10 @@ proc genConv(p: PProc, n: PNode, r: var TCompRes) =
|
||||
return
|
||||
case dest.kind:
|
||||
of tyBool:
|
||||
r.res = "(($1)? 1:0)" % [r.res]
|
||||
r.res = "(!!($1))" % [r.res]
|
||||
r.kind = resExpr
|
||||
of tyInt:
|
||||
r.res = "($1|0)" % [r.res]
|
||||
r.res = "(($1)|0)" % [r.res]
|
||||
else:
|
||||
# TODO: What types must we handle here?
|
||||
discard
|
||||
@@ -2161,8 +2170,22 @@ proc genProc(oldProc: PProc, prc: PSym): Rope =
|
||||
returnStmt = "return $#;$n" % [a.res]
|
||||
|
||||
p.nested: genStmt(p, prc.getBody)
|
||||
let def = "function $#($#) {$n$#$#$#$#$#" %
|
||||
[name, header,
|
||||
|
||||
var def: Rope
|
||||
if not prc.constraint.isNil:
|
||||
def = (prc.constraint.strVal & " {$n$#$#$#$#$#") %
|
||||
[ returnType,
|
||||
name,
|
||||
header,
|
||||
optionaLine(p.globals),
|
||||
optionaLine(p.locals),
|
||||
optionaLine(resultAsgn),
|
||||
optionaLine(genProcBody(p, prc)),
|
||||
optionaLine(p.indentLine(returnStmt))]
|
||||
else:
|
||||
def = "function $#($#) {$n$#$#$#$#$#" %
|
||||
[ name,
|
||||
header,
|
||||
optionaLine(p.globals),
|
||||
optionaLine(p.locals),
|
||||
optionaLine(resultAsgn),
|
||||
@@ -2229,7 +2252,7 @@ proc gen(p: PProc, n: PNode, r: var TCompRes) =
|
||||
case n.kind
|
||||
of nkSym:
|
||||
genSym(p, n, r)
|
||||
of nkCharLit..nkUInt32Lit:
|
||||
of nkCharLit..nkUInt64Lit:
|
||||
if n.typ.kind == tyBool:
|
||||
r.res = if n.intVal == 0: rope"false" else: rope"true"
|
||||
else:
|
||||
@@ -2346,6 +2369,8 @@ proc gen(p: PProc, n: PNode, r: var TCompRes) =
|
||||
of nkGotoState, nkState:
|
||||
internalError(n.info, "first class iterators not implemented")
|
||||
of nkPragmaBlock: gen(p, n.lastSon, r)
|
||||
of nkComesFrom:
|
||||
discard "XXX to implement for better stack traces"
|
||||
else: internalError(n.info, "gen: unknown node type: " & $n.kind)
|
||||
|
||||
var globals: PGlobals
|
||||
|
||||
@@ -84,7 +84,7 @@ proc genObjectInfo(p: PProc, typ: PType, name: Rope) =
|
||||
|
||||
proc genTupleFields(p: PProc, typ: PType): Rope =
|
||||
var s: Rope = nil
|
||||
for i in 0 .. <typ.len:
|
||||
for i in 0 ..< typ.len:
|
||||
if i > 0: add(s, ", " & tnl)
|
||||
s.addf("{kind: 1, offset: \"Field$1\", len: 0, " &
|
||||
"typ: $2, name: \"Field$1\", sons: null}",
|
||||
|
||||
@@ -455,6 +455,7 @@ type
|
||||
LiftingPass = object
|
||||
processed: IntSet
|
||||
envVars: Table[int, PNode]
|
||||
inContainer: int
|
||||
|
||||
proc initLiftingPass(fn: PSym): LiftingPass =
|
||||
result.processed = initIntSet()
|
||||
@@ -597,6 +598,8 @@ proc liftCapturedVars(n: PNode; owner: PSym; d: DetectionPass;
|
||||
|
||||
proc transformYield(n: PNode; owner: PSym; d: DetectionPass;
|
||||
c: var LiftingPass): PNode =
|
||||
if c.inContainer > 0:
|
||||
localError(n.info, "invalid control flow: 'yield' within a constructor")
|
||||
let state = getStateField(owner)
|
||||
assert state != nil
|
||||
assert state.typ != nil
|
||||
@@ -703,11 +706,14 @@ proc liftCapturedVars(n: PNode; owner: PSym; d: DetectionPass;
|
||||
if not c.processed.containsOrIncl(s.id):
|
||||
#if s.name.s == "temp":
|
||||
# echo renderTree(s.getBody, {renderIds})
|
||||
let oldInContainer = c.inContainer
|
||||
c.inContainer = 0
|
||||
let body = wrapIterBody(liftCapturedVars(s.getBody, s, d, c), s)
|
||||
if c.envvars.getOrDefault(s.id).isNil:
|
||||
s.ast.sons[bodyPos] = body
|
||||
else:
|
||||
s.ast.sons[bodyPos] = newTree(nkStmtList, rawClosureCreation(s, d, c), body)
|
||||
c.inContainer = oldInContainer
|
||||
if s.typ.callConv == ccClosure:
|
||||
result = symToClosure(n, owner, d, c)
|
||||
elif s.id in d.capturedVars:
|
||||
@@ -717,7 +723,7 @@ proc liftCapturedVars(n: PNode; owner: PSym; d: DetectionPass;
|
||||
result = accessViaEnvParam(n, owner)
|
||||
else:
|
||||
result = accessViaEnvVar(n, owner, d, c)
|
||||
of nkEmpty..pred(nkSym), succ(nkSym)..nkNilLit,
|
||||
of nkEmpty..pred(nkSym), succ(nkSym)..nkNilLit, nkComesFrom,
|
||||
nkTemplateDef, nkTypeSection:
|
||||
discard
|
||||
of nkProcDef, nkMethodDef, nkConverterDef, nkMacroDef:
|
||||
@@ -733,9 +739,12 @@ proc liftCapturedVars(n: PNode; owner: PSym; d: DetectionPass;
|
||||
n.sons[1] = x.sons[1]
|
||||
of nkLambdaKinds, nkIteratorDef, nkFuncDef:
|
||||
if n.typ != nil and n[namePos].kind == nkSym:
|
||||
let oldInContainer = c.inContainer
|
||||
c.inContainer = 0
|
||||
let m = newSymNode(n[namePos].sym)
|
||||
m.typ = n.typ
|
||||
result = liftCapturedVars(m, owner, d, c)
|
||||
c.inContainer = oldInContainer
|
||||
of nkHiddenStdConv:
|
||||
if n.len == 2:
|
||||
n.sons[1] = liftCapturedVars(n[1], owner, d, c)
|
||||
@@ -750,8 +759,12 @@ proc liftCapturedVars(n: PNode; owner: PSym; d: DetectionPass;
|
||||
# special case 'when nimVm' due to bug #3636:
|
||||
n.sons[1] = liftCapturedVars(n[1], owner, d, c)
|
||||
return
|
||||
|
||||
let inContainer = n.kind in {nkObjConstr, nkBracket}
|
||||
if inContainer: inc c.inContainer
|
||||
for i in 0..<n.len:
|
||||
n.sons[i] = liftCapturedVars(n[i], owner, d, c)
|
||||
if inContainer: dec c.inContainer
|
||||
|
||||
# ------------------ old stuff -------------------------------------------
|
||||
|
||||
@@ -764,7 +777,10 @@ proc semCaptureSym*(s, owner: PSym) =
|
||||
var o = owner.skipGenericOwner
|
||||
while o.kind != skModule and o != nil:
|
||||
if s.owner == o:
|
||||
owner.typ.callConv = ccClosure
|
||||
if owner.typ.callConv in {ccClosure, ccDefault} or owner.kind == skIterator:
|
||||
owner.typ.callConv = ccClosure
|
||||
else:
|
||||
discard "do not produce an error here, but later"
|
||||
#echo "computing .closure for ", owner.name.s, " ", owner.info, " because of ", s.name.s
|
||||
o = o.skipGenericOwner
|
||||
# since the analysis is not entirely correct, we don't set 'tfCapturesEnv'
|
||||
|
||||
@@ -33,13 +33,13 @@ type
|
||||
TTokType* = enum
|
||||
tkInvalid, tkEof, # order is important here!
|
||||
tkSymbol, # keywords:
|
||||
tkAddr, tkAnd, tkAs, tkAsm, tkAtomic,
|
||||
tkAddr, tkAnd, tkAs, tkAsm,
|
||||
tkBind, tkBlock, tkBreak, tkCase, tkCast,
|
||||
tkConcept, tkConst, tkContinue, tkConverter,
|
||||
tkDefer, tkDiscard, tkDistinct, tkDiv, tkDo,
|
||||
tkElif, tkElse, tkEnd, tkEnum, tkExcept, tkExport,
|
||||
tkFinally, tkFor, tkFrom, tkFunc,
|
||||
tkGeneric, tkIf, tkImport, tkIn, tkInclude, tkInterface,
|
||||
tkIf, tkImport, tkIn, tkInclude, tkInterface,
|
||||
tkIs, tkIsnot, tkIterator,
|
||||
tkLet,
|
||||
tkMacro, tkMethod, tkMixin, tkMod, tkNil, tkNot, tkNotin,
|
||||
@@ -75,12 +75,12 @@ const
|
||||
tokKeywordHigh* = pred(tkIntLit)
|
||||
TokTypeToStr*: array[TTokType, string] = ["tkInvalid", "[EOF]",
|
||||
"tkSymbol",
|
||||
"addr", "and", "as", "asm", "atomic",
|
||||
"addr", "and", "as", "asm",
|
||||
"bind", "block", "break", "case", "cast",
|
||||
"concept", "const", "continue", "converter",
|
||||
"defer", "discard", "distinct", "div", "do",
|
||||
"elif", "else", "end", "enum", "except", "export",
|
||||
"finally", "for", "from", "func", "generic", "if",
|
||||
"finally", "for", "from", "func", "if",
|
||||
"import", "in", "include", "interface", "is", "isnot", "iterator",
|
||||
"let",
|
||||
"macro", "method", "mixin", "mod",
|
||||
@@ -129,6 +129,7 @@ type
|
||||
when defined(nimpretty):
|
||||
offsetA*, offsetB*: int # used for pretty printing so that literals
|
||||
# like 0b01 or r"\L" are unaffected
|
||||
commentOffsetA*, commentOffsetB*: int
|
||||
|
||||
TErrorHandler* = proc (info: TLineInfo; msg: TMsgKind; arg: string)
|
||||
TLexer* = object of TBaseLexer
|
||||
@@ -144,6 +145,10 @@ type
|
||||
when defined(nimsuggest):
|
||||
previousToken: TLineInfo
|
||||
|
||||
when defined(nimpretty):
|
||||
var
|
||||
gIndentationWidth*: int
|
||||
|
||||
var gLinesCompiled*: int # all lines that have been compiled
|
||||
|
||||
proc getLineInfo*(L: TLexer, tok: TToken): TLineInfo {.inline.} =
|
||||
@@ -151,6 +156,8 @@ proc getLineInfo*(L: TLexer, tok: TToken): TLineInfo {.inline.} =
|
||||
when defined(nimpretty):
|
||||
result.offsetA = tok.offsetA
|
||||
result.offsetB = tok.offsetB
|
||||
result.commentOffsetA = tok.commentOffsetA
|
||||
result.commentOffsetB = tok.commentOffsetB
|
||||
|
||||
proc isKeyword*(kind: TTokType): bool =
|
||||
result = (kind >= tokKeywordLow) and (kind <= tokKeywordHigh)
|
||||
@@ -198,6 +205,9 @@ proc initToken*(L: var TToken) =
|
||||
L.fNumber = 0.0
|
||||
L.base = base10
|
||||
L.ident = nil
|
||||
when defined(nimpretty):
|
||||
L.commentOffsetA = 0
|
||||
L.commentOffsetB = 0
|
||||
|
||||
proc fillToken(L: var TToken) =
|
||||
L.tokType = tkInvalid
|
||||
@@ -208,6 +218,9 @@ proc fillToken(L: var TToken) =
|
||||
L.fNumber = 0.0
|
||||
L.base = base10
|
||||
L.ident = nil
|
||||
when defined(nimpretty):
|
||||
L.commentOffsetA = 0
|
||||
L.commentOffsetB = 0
|
||||
|
||||
proc openLexer*(lex: var TLexer, fileIdx: int32, inputstream: PLLStream;
|
||||
cache: IdentCache) =
|
||||
@@ -680,7 +693,7 @@ proc getEscapedChar(L: var TLexer, tok: var TToken) =
|
||||
proc newString(s: cstring, len: int): string =
|
||||
## XXX, how come there is no support for this?
|
||||
result = newString(len)
|
||||
for i in 0 .. <len:
|
||||
for i in 0 ..< len:
|
||||
result[i] = s[i]
|
||||
|
||||
proc handleCRLF(L: var TLexer, pos: int): int =
|
||||
@@ -847,6 +860,23 @@ proc getOperator(L: var TLexer, tok: var TToken) =
|
||||
if buf[pos] in {CR, LF, nimlexbase.EndOfFile}:
|
||||
tok.strongSpaceB = -1
|
||||
|
||||
proc newlineFollows*(L: var TLexer): bool =
|
||||
var pos = L.bufpos
|
||||
var buf = L.buf
|
||||
while true:
|
||||
case buf[pos]
|
||||
of ' ', '\t':
|
||||
inc(pos)
|
||||
of CR, LF:
|
||||
result = true
|
||||
break
|
||||
of '#':
|
||||
inc(pos)
|
||||
if buf[pos] == '#': inc(pos)
|
||||
if buf[pos] != '[': return true
|
||||
else:
|
||||
break
|
||||
|
||||
proc skipMultiLineComment(L: var TLexer; tok: var TToken; start: int;
|
||||
isDoc: bool) =
|
||||
var pos = start
|
||||
@@ -996,18 +1026,27 @@ proc skip(L: var TLexer, tok: var TToken) =
|
||||
of '#':
|
||||
# do not skip documentation comment:
|
||||
if buf[pos+1] == '#': break
|
||||
when defined(nimpretty):
|
||||
tok.commentOffsetA = L.offsetBase + pos
|
||||
if buf[pos+1] == '[':
|
||||
skipMultiLineComment(L, tok, pos+2, false)
|
||||
pos = L.bufpos
|
||||
buf = L.buf
|
||||
when defined(nimpretty):
|
||||
tok.commentOffsetB = L.offsetBase + pos
|
||||
else:
|
||||
tokenBegin(tok, pos)
|
||||
while buf[pos] notin {CR, LF, nimlexbase.EndOfFile}: inc(pos)
|
||||
tokenEndIgnore(tok, pos+1)
|
||||
when defined(nimpretty):
|
||||
tok.commentOffsetB = L.offsetBase + pos + 1
|
||||
else:
|
||||
break # EndOfFile also leaves the loop
|
||||
tokenEndPrevious(tok, pos-1)
|
||||
L.bufpos = pos
|
||||
when defined(nimpretty):
|
||||
if gIndentationWidth <= 0:
|
||||
gIndentationWidth = tok.indent
|
||||
|
||||
proc rawGetTok*(L: var TLexer, tok: var TToken) =
|
||||
template atTokenEnd() {.dirty.} =
|
||||
|
||||
70
compiler/liftlocals.nim
Normal file
70
compiler/liftlocals.nim
Normal file
@@ -0,0 +1,70 @@
|
||||
#
|
||||
#
|
||||
# The Nim Compiler
|
||||
# (c) Copyright 2015 Andreas Rumpf
|
||||
#
|
||||
# See the file "copying.txt", included in this
|
||||
# distribution, for details about the copyright.
|
||||
#
|
||||
|
||||
## This module implements the '.liftLocals' pragma.
|
||||
|
||||
import
|
||||
intsets, strutils, options, ast, astalgo, msgs,
|
||||
idents, renderer, types, lowerings
|
||||
|
||||
from pragmas import getPragmaVal
|
||||
from wordrecg import wLiftLocals
|
||||
|
||||
type
|
||||
Ctx = object
|
||||
partialParam: PSym
|
||||
objType: PType
|
||||
|
||||
proc interestingVar(s: PSym): bool {.inline.} =
|
||||
result = s.kind in {skVar, skLet, skTemp, skForVar, skResult} and
|
||||
sfGlobal notin s.flags
|
||||
|
||||
proc lookupOrAdd(c: var Ctx; s: PSym; info: TLineInfo): PNode =
|
||||
let field = addUniqueField(c.objType, s)
|
||||
var deref = newNodeI(nkHiddenDeref, info)
|
||||
deref.typ = c.objType
|
||||
add(deref, newSymNode(c.partialParam, info))
|
||||
result = newNodeI(nkDotExpr, info)
|
||||
add(result, deref)
|
||||
add(result, newSymNode(field))
|
||||
result.typ = field.typ
|
||||
|
||||
proc liftLocals(n: PNode; i: int; c: var Ctx) =
|
||||
let it = n[i]
|
||||
case it.kind
|
||||
of nkSym:
|
||||
if interestingVar(it.sym):
|
||||
n[i] = lookupOrAdd(c, it.sym, it.info)
|
||||
of procDefs, nkTypeSection: discard
|
||||
else:
|
||||
for i in 0 ..< it.safeLen:
|
||||
liftLocals(it, i, c)
|
||||
|
||||
proc lookupParam(params, dest: PNode): PSym =
|
||||
if dest.kind != nkIdent: return nil
|
||||
for i in 1 ..< params.len:
|
||||
if params[i].kind == nkSym and params[i].sym.name.id == dest.ident.id:
|
||||
return params[i].sym
|
||||
|
||||
proc liftLocalsIfRequested*(prc: PSym; n: PNode): PNode =
|
||||
let liftDest = getPragmaVal(prc.ast, wLiftLocals)
|
||||
if liftDest == nil: return n
|
||||
let partialParam = lookupParam(prc.typ.n, liftDest)
|
||||
if partialParam.isNil:
|
||||
localError(liftDest.info, "'$1' is not a parameter of '$2'" %
|
||||
[$liftDest, prc.name.s])
|
||||
return n
|
||||
let objType = partialParam.typ.skipTypes(abstractPtrs)
|
||||
if objType.kind != tyObject or tfPartial notin objType.flags:
|
||||
localError(liftDest.info, "parameter '$1' is not a pointer to a partial object" % $liftDest)
|
||||
return n
|
||||
var c = Ctx(partialParam: partialParam, objType: objType)
|
||||
let w = newTree(nkStmtList, n)
|
||||
liftLocals(w, 0, c)
|
||||
result = w[0]
|
||||
@@ -39,14 +39,18 @@ proc considerQuotedIdent*(n: PNode, origin: PNode = nil): PIdent =
|
||||
of 1: result = considerQuotedIdent(n.sons[0], origin)
|
||||
else:
|
||||
var id = ""
|
||||
for i in 0.. <n.len:
|
||||
for i in 0..<n.len:
|
||||
let x = n.sons[i]
|
||||
case x.kind
|
||||
of nkIdent: id.add(x.ident.s)
|
||||
of nkSym: id.add(x.sym.name.s)
|
||||
else: handleError(n, origin)
|
||||
result = getIdent(id)
|
||||
of nkOpenSymChoice, nkClosedSymChoice: result = n.sons[0].sym.name
|
||||
of nkOpenSymChoice, nkClosedSymChoice:
|
||||
if n[0].kind == nkSym:
|
||||
result = n.sons[0].sym.name
|
||||
else:
|
||||
handleError(n, origin)
|
||||
else:
|
||||
handleError(n, origin)
|
||||
|
||||
@@ -155,7 +159,7 @@ proc ensureNoMissingOrUnusedSymbols(scope: PScope) =
|
||||
var s = initTabIter(it, scope.symbols)
|
||||
var missingImpls = 0
|
||||
while s != nil:
|
||||
if sfForward in s.flags:
|
||||
if sfForward in s.flags and s.kind != skType:
|
||||
# too many 'implementation of X' errors are annoying
|
||||
# and slow 'suggest' down:
|
||||
if missingImpls == 0:
|
||||
@@ -379,7 +383,11 @@ proc initOverloadIter*(o: var TOverloadIter, c: PContext, n: PNode): PSym =
|
||||
result = errorSym(c, n.sons[1])
|
||||
of nkClosedSymChoice, nkOpenSymChoice:
|
||||
o.mode = oimSymChoice
|
||||
result = n.sons[0].sym
|
||||
if n[0].kind == nkSym:
|
||||
result = n.sons[0].sym
|
||||
else:
|
||||
o.mode = oimDone
|
||||
return nil
|
||||
o.symChoiceIndex = 1
|
||||
o.inSymChoice = initIntSet()
|
||||
incl(o.inSymChoice, result.id)
|
||||
@@ -437,13 +445,14 @@ proc nextOverloadIter*(o: var TOverloadIter, c: PContext, n: PNode): PSym =
|
||||
|
||||
if result != nil and result.kind == skStub: loadStub(result)
|
||||
|
||||
proc pickSym*(c: PContext, n: PNode; kind: TSymKind;
|
||||
proc pickSym*(c: PContext, n: PNode; kinds: set[TSymKind];
|
||||
flags: TSymFlags = {}): PSym =
|
||||
var o: TOverloadIter
|
||||
var a = initOverloadIter(o, c, n)
|
||||
while a != nil:
|
||||
if a.kind == kind and flags <= a.flags:
|
||||
return a
|
||||
if a.kind in kinds and flags <= a.flags:
|
||||
if result == nil: result = a
|
||||
else: return nil # ambiguous
|
||||
a = nextOverloadIter(o, c, n)
|
||||
|
||||
proc isInfixAs*(n: PNode): bool =
|
||||
|
||||
@@ -182,8 +182,9 @@ proc addField*(obj: PType; s: PSym) =
|
||||
field.position = sonsLen(obj.n)
|
||||
addSon(obj.n, newSymNode(field))
|
||||
|
||||
proc addUniqueField*(obj: PType; s: PSym) =
|
||||
if lookupInRecord(obj.n, s.id) == nil:
|
||||
proc addUniqueField*(obj: PType; s: PSym): PSym {.discardable.} =
|
||||
result = lookupInRecord(obj.n, s.id)
|
||||
if result == nil:
|
||||
var field = newSym(skField, getIdent(s.name.s & $obj.n.len), s.owner, s.info)
|
||||
field.id = -s.id
|
||||
let t = skipIntLit(s.typ)
|
||||
@@ -191,6 +192,7 @@ proc addUniqueField*(obj: PType; s: PSym) =
|
||||
assert t.kind != tyStmt
|
||||
field.position = sonsLen(obj.n)
|
||||
addSon(obj.n, newSymNode(field))
|
||||
result = field
|
||||
|
||||
proc newDotExpr(obj, b: PSym): PNode =
|
||||
result = newNodeI(nkDotExpr, obj.info)
|
||||
@@ -463,7 +465,7 @@ proc setupArgsForConcurrency(n: PNode; objType: PType; scratchObj: PSym,
|
||||
varSection, varInit, result: PNode) =
|
||||
let formals = n[0].typ.n
|
||||
let tmpName = getIdent(genPrefix)
|
||||
for i in 1 .. <n.len:
|
||||
for i in 1 ..< n.len:
|
||||
# we pick n's type here, which hopefully is 'tyArray' and not
|
||||
# 'tyOpenArray':
|
||||
var argType = n[i].typ.skipTypes(abstractInst)
|
||||
@@ -519,7 +521,7 @@ proc setupArgsForParallelism(n: PNode; objType: PType; scratchObj: PSym;
|
||||
let tmpName = getIdent(genPrefix)
|
||||
# we need to copy the foreign scratch object fields into local variables
|
||||
# for correctness: These are called 'threadLocal' here.
|
||||
for i in 1 .. <n.len:
|
||||
for i in 1 ..< n.len:
|
||||
let n = n[i]
|
||||
let argType = skipTypes(if i < formals.len: formals[i].typ else: n.typ,
|
||||
abstractInst)
|
||||
|
||||
@@ -16,12 +16,12 @@ import
|
||||
cgen, jsgen, json, nversion,
|
||||
platform, nimconf, importer, passaux, depends, vm, vmdef, types, idgen,
|
||||
docgen2, service, parser, modules, ccgutils, sigmatch, ropes,
|
||||
modulegraphs
|
||||
modulegraphs, tables
|
||||
|
||||
from magicsys import systemModule, resetSysTypes
|
||||
|
||||
proc rodPass =
|
||||
if optSymbolFiles in gGlobalOptions:
|
||||
if gSymbolFiles in {enabledSf, writeOnlySf}:
|
||||
registerPass(rodwritePass)
|
||||
|
||||
proc codegenPass =
|
||||
@@ -36,6 +36,9 @@ proc writeDepsFile(g: ModuleGraph; project: string) =
|
||||
for m in g.modules:
|
||||
if m != nil:
|
||||
f.writeLine(toFullPath(m.position.int32))
|
||||
for k in g.inclToMod.keys:
|
||||
if g.getModule(k).isNil: # don't repeat includes which are also modules
|
||||
f.writeLine(k.toFullPath)
|
||||
f.close()
|
||||
|
||||
proc commandGenDepend(graph: ModuleGraph; cache: IdentCache) =
|
||||
@@ -77,6 +80,8 @@ proc commandCompileToC(graph: ModuleGraph; cache: IdentCache) =
|
||||
let proj = changeFileExt(gProjectFull, "")
|
||||
extccomp.callCCompiler(proj)
|
||||
extccomp.writeJsonBuildInstructions(proj)
|
||||
if optGenScript in gGlobalOptions:
|
||||
writeDepsFile(graph, toGeneratedFile(proj, ""))
|
||||
|
||||
proc commandJsonScript(graph: ModuleGraph; cache: IdentCache) =
|
||||
let proj = changeFileExt(gProjectFull, "")
|
||||
@@ -186,12 +191,12 @@ proc mainCommand*(graph: ModuleGraph; cache: IdentCache) =
|
||||
of "php":
|
||||
gCmd = cmdCompileToPHP
|
||||
commandCompileToJS(graph, cache)
|
||||
of "doc":
|
||||
of "doc0":
|
||||
wantMainModule()
|
||||
gCmd = cmdDoc
|
||||
loadConfigs(DocConfig, cache)
|
||||
commandDoc()
|
||||
of "doc2":
|
||||
of "doc2", "doc":
|
||||
gCmd = cmdDoc
|
||||
loadConfigs(DocConfig, cache)
|
||||
defineSymbol("nimdoc")
|
||||
@@ -217,6 +222,13 @@ proc mainCommand*(graph: ModuleGraph; cache: IdentCache) =
|
||||
wantMainModule()
|
||||
defineSymbol("nimdoc")
|
||||
commandDoc2(graph, cache, true)
|
||||
of "ctags":
|
||||
wantMainModule()
|
||||
gCmd = cmdDoc
|
||||
loadConfigs(DocConfig, cache)
|
||||
wantMainModule()
|
||||
defineSymbol("nimdoc")
|
||||
commandTags()
|
||||
of "buildindex":
|
||||
gCmd = cmdDoc
|
||||
loadConfigs(DocConfig, cache)
|
||||
|
||||
174
compiler/modulepaths.nim
Normal file
174
compiler/modulepaths.nim
Normal file
@@ -0,0 +1,174 @@
|
||||
#
|
||||
#
|
||||
# The Nim Compiler
|
||||
# (c) Copyright 2017 Contributors
|
||||
#
|
||||
# See the file "copying.txt", included in this
|
||||
# distribution, for details about the copyright.
|
||||
#
|
||||
|
||||
import ast, renderer, strutils, msgs, options, idents, os
|
||||
|
||||
import nimblecmd
|
||||
|
||||
const
|
||||
considerParentDirs = not defined(noParentProjects)
|
||||
considerNimbleDirs = not defined(noNimbleDirs)
|
||||
|
||||
proc findInNimbleDir(pkg, subdir, dir: string): string =
|
||||
var best = ""
|
||||
var bestv = ""
|
||||
for k, p in os.walkDir(dir, relative=true):
|
||||
if k == pcDir and p.len > pkg.len+1 and
|
||||
p[pkg.len] == '-' and p.startsWith(pkg):
|
||||
let (_, a) = getPathVersion(p)
|
||||
if bestv.len == 0 or bestv < a:
|
||||
bestv = a
|
||||
best = dir / p
|
||||
|
||||
if best.len > 0:
|
||||
var f: File
|
||||
if open(f, best / changeFileExt(pkg, ".nimble-link")):
|
||||
# the second line contains what we're interested in, see:
|
||||
# https://github.com/nim-lang/nimble#nimble-link
|
||||
var override = ""
|
||||
discard readLine(f, override)
|
||||
discard readLine(f, override)
|
||||
close(f)
|
||||
if not override.isAbsolute():
|
||||
best = best / override
|
||||
else:
|
||||
best = override
|
||||
let f = if subdir.len == 0: pkg else: subdir
|
||||
let res = addFileExt(best / f, "nim")
|
||||
if best.len > 0 and fileExists(res):
|
||||
result = res
|
||||
|
||||
const stdlibDirs = [
|
||||
"pure", "core", "arch",
|
||||
"pure/collections",
|
||||
"pure/concurrency", "impure",
|
||||
"wrappers", "wrappers/linenoise",
|
||||
"windows", "posix", "js"]
|
||||
|
||||
proc resolveDollar(project, source, pkg, subdir: string; info: TLineInfo): string =
|
||||
template attempt(a) =
|
||||
let x = addFileExt(a, "nim")
|
||||
if fileExists(x): return x
|
||||
|
||||
case pkg
|
||||
of "stdlib":
|
||||
if subdir.len == 0:
|
||||
return options.libpath
|
||||
else:
|
||||
for candidate in stdlibDirs:
|
||||
attempt(options.libpath / candidate / subdir)
|
||||
of "root":
|
||||
let root = project.splitFile.dir
|
||||
if subdir.len == 0:
|
||||
return root
|
||||
else:
|
||||
attempt(root / subdir)
|
||||
else:
|
||||
when considerParentDirs:
|
||||
var p = parentDir(source.splitFile.dir)
|
||||
# support 'import $karax':
|
||||
let f = if subdir.len == 0: pkg else: subdir
|
||||
|
||||
while p.len > 0:
|
||||
let dir = p / pkg
|
||||
if dirExists(dir):
|
||||
attempt(dir / f)
|
||||
# 2nd attempt: try to use 'karax/karax'
|
||||
attempt(dir / pkg / f)
|
||||
# 3rd attempt: try to use 'karax/src/karax'
|
||||
attempt(dir / "src" / f)
|
||||
attempt(dir / "src" / pkg / f)
|
||||
p = parentDir(p)
|
||||
|
||||
when considerNimbleDirs:
|
||||
if not options.gNoNimblePath:
|
||||
var nimbleDir = getEnv("NIMBLE_DIR")
|
||||
if nimbleDir.len == 0: nimbleDir = getHomeDir() / ".nimble"
|
||||
result = findInNimbleDir(pkg, subdir, nimbleDir / "pkgs")
|
||||
if result.len > 0: return result
|
||||
when not defined(windows):
|
||||
result = findInNimbleDir(pkg, subdir, "/opt/nimble/pkgs")
|
||||
if result.len > 0: return result
|
||||
|
||||
proc scriptableImport(pkg, sub: string; info: TLineInfo): string =
|
||||
result = resolveDollar(gProjectFull, info.toFullPath(), pkg, sub, info)
|
||||
if result.isNil: result = ""
|
||||
|
||||
proc lookupPackage(pkg, subdir: PNode): string =
|
||||
let sub = if subdir != nil: renderTree(subdir, {renderNoComments}).replace(" ") else: ""
|
||||
case pkg.kind
|
||||
of nkStrLit, nkRStrLit, nkTripleStrLit:
|
||||
result = scriptableImport(pkg.strVal, sub, pkg.info)
|
||||
of nkIdent:
|
||||
result = scriptableImport(pkg.ident.s, sub, pkg.info)
|
||||
else:
|
||||
localError(pkg.info, "package name must be an identifier or string literal")
|
||||
result = ""
|
||||
|
||||
proc getModuleName*(n: PNode): string =
|
||||
# This returns a short relative module name without the nim extension
|
||||
# e.g. like "system", "importer" or "somepath/module"
|
||||
# The proc won't perform any checks that the path is actually valid
|
||||
case n.kind
|
||||
of nkStrLit, nkRStrLit, nkTripleStrLit:
|
||||
try:
|
||||
result = pathSubs(n.strVal, n.info.toFullPath().splitFile().dir)
|
||||
except ValueError:
|
||||
localError(n.info, "invalid path: " & n.strVal)
|
||||
result = n.strVal
|
||||
of nkIdent:
|
||||
result = n.ident.s
|
||||
of nkSym:
|
||||
result = n.sym.name.s
|
||||
of nkInfix:
|
||||
let n0 = n[0]
|
||||
let n1 = n[1]
|
||||
if n0.kind == nkIdent and n0.ident.id == getIdent("as").id:
|
||||
# XXX hack ahead:
|
||||
n.kind = nkImportAs
|
||||
n.sons[0] = n.sons[1]
|
||||
n.sons[1] = n.sons[2]
|
||||
n.sons.setLen(2)
|
||||
return getModuleName(n.sons[0])
|
||||
if n1.kind == nkPrefix and n1[0].kind == nkIdent and n1[0].ident.s == "$":
|
||||
if n0.kind == nkIdent and n0.ident.s == "/":
|
||||
result = lookupPackage(n1[1], n[2])
|
||||
else:
|
||||
localError(n.info, "only '/' supported with $package notation")
|
||||
result = ""
|
||||
else:
|
||||
# hacky way to implement 'x / y /../ z':
|
||||
result = getModuleName(n1)
|
||||
result.add renderTree(n0, {renderNoComments})
|
||||
result.add getModuleName(n[2])
|
||||
of nkPrefix:
|
||||
if n.sons[0].kind == nkIdent and n.sons[0].ident.s == "$":
|
||||
result = lookupPackage(n[1], nil)
|
||||
else:
|
||||
# hacky way to implement 'x / y /../ z':
|
||||
result = renderTree(n, {renderNoComments}).replace(" ")
|
||||
of nkDotExpr:
|
||||
result = renderTree(n, {renderNoComments}).replace(".", "/")
|
||||
of nkImportAs:
|
||||
result = getModuleName(n.sons[0])
|
||||
else:
|
||||
localError(n.info, errGenerated, "invalid module name: '$1'" % n.renderTree)
|
||||
result = ""
|
||||
|
||||
proc checkModuleName*(n: PNode; doLocalError=true): int32 =
|
||||
# This returns the full canonical path for a given module import
|
||||
let modulename = n.getModuleName
|
||||
let fullPath = findModule(modulename, n.info.toFullPath)
|
||||
if fullPath.len == 0:
|
||||
if doLocalError:
|
||||
let m = if modulename.len > 0: modulename else: $n
|
||||
localError(n.info, errCannotOpenFile, m)
|
||||
result = InvalidFileIDX
|
||||
else:
|
||||
result = fullPath.fileInfoIdx
|
||||
@@ -26,7 +26,8 @@ type
|
||||
errAtPopWithoutPush, errEmptyAsm, errInvalidIndentation,
|
||||
errExceptionExpected, errExceptionAlreadyHandled,
|
||||
errYieldNotAllowedHere, errYieldNotAllowedInTryStmt,
|
||||
errInvalidNumberOfYieldExpr, errCannotReturnExpr, errAttemptToRedefine,
|
||||
errInvalidNumberOfYieldExpr, errCannotReturnExpr,
|
||||
errNoReturnWithReturnTypeNotAllowed, errAttemptToRedefine,
|
||||
errStmtInvalidAfterReturn, errStmtExpected, errInvalidLabel,
|
||||
errInvalidCmdLineOption, errCmdLineArgExpected, errCmdLineNoArgExpected,
|
||||
errInvalidVarSubstitution, errUnknownVar, errUnknownCcompiler,
|
||||
@@ -61,7 +62,7 @@ type
|
||||
errBaseTypeMustBeOrdinal, errInheritanceOnlyWithNonFinalObjects,
|
||||
errInheritanceOnlyWithEnums, errIllegalRecursionInTypeX,
|
||||
errCannotInstantiateX, errExprHasNoAddress, errXStackEscape,
|
||||
errVarForOutParamNeeded,
|
||||
errVarForOutParamNeededX,
|
||||
errPureTypeMismatch, errTypeMismatch, errButExpected, errButExpectedX,
|
||||
errAmbiguousCallXYZ, errWrongNumberOfArguments,
|
||||
errWrongNumberOfArgumentsInCall,
|
||||
@@ -179,8 +180,9 @@ const
|
||||
errYieldNotAllowedInTryStmt: "'yield' cannot be used within 'try' in a non-inlined iterator",
|
||||
errInvalidNumberOfYieldExpr: "invalid number of \'yield\' expressions",
|
||||
errCannotReturnExpr: "current routine cannot return an expression",
|
||||
errNoReturnWithReturnTypeNotAllowed: "routines with NoReturn pragma are not allowed to have return type",
|
||||
errAttemptToRedefine: "redefinition of \'$1\'",
|
||||
errStmtInvalidAfterReturn: "statement not allowed after \'return\', \'break\', \'raise\' or \'continue'",
|
||||
errStmtInvalidAfterReturn: "statement not allowed after \'return\', \'break\', \'raise\', \'continue\' or proc call with noreturn pragma",
|
||||
errStmtExpected: "statement expected",
|
||||
errInvalidLabel: "\'$1\' is no label",
|
||||
errInvalidCmdLineOption: "invalid command line option: \'$1\'",
|
||||
@@ -268,7 +270,7 @@ const
|
||||
errCannotInstantiateX: "cannot instantiate: \'$1\'",
|
||||
errExprHasNoAddress: "expression has no address",
|
||||
errXStackEscape: "address of '$1' may not escape its stack frame",
|
||||
errVarForOutParamNeeded: "for a \'var\' type a variable needs to be passed",
|
||||
errVarForOutParamNeededX: "for a \'var\' type a variable needs to be passed; but '$1' is immutable",
|
||||
errPureTypeMismatch: "type mismatch",
|
||||
errTypeMismatch: "type mismatch: got (",
|
||||
errButExpected: "but expected one of: ",
|
||||
@@ -500,6 +502,7 @@ type
|
||||
fileIndex*: int32
|
||||
when defined(nimpretty):
|
||||
offsetA*, offsetB*: int
|
||||
commentOffsetA*, commentOffsetB*: int
|
||||
|
||||
TErrorOutput* = enum
|
||||
eStdOut
|
||||
|
||||
@@ -28,6 +28,10 @@ proc newVersion*(ver: string): Version =
|
||||
proc isSpecial(ver: Version): bool =
|
||||
return ($ver).len > 0 and ($ver)[0] == '#'
|
||||
|
||||
proc isValidVersion(v: string): bool =
|
||||
if v.len > 0:
|
||||
if v[0] in {'#'} + Digits: return true
|
||||
|
||||
proc `<`*(ver: Version, ver2: Version): bool =
|
||||
## This is synced from Nimble's version module.
|
||||
|
||||
@@ -72,15 +76,23 @@ proc getPathVersion*(p: string): tuple[name, version: string] =
|
||||
result.name = p
|
||||
return
|
||||
|
||||
for i in sepIdx..<p.len:
|
||||
if p[i] in {DirSep, AltSep}:
|
||||
result.name = p
|
||||
return
|
||||
|
||||
result.name = p[0 .. sepIdx - 1]
|
||||
result.version = p.substr(sepIdx + 1)
|
||||
|
||||
proc addPackage(packages: StringTableRef, p: string) =
|
||||
proc addPackage(packages: StringTableRef, p: string; info: TLineInfo) =
|
||||
let (name, ver) = getPathVersion(p)
|
||||
let version = newVersion(ver)
|
||||
if packages.getOrDefault(name).newVersion < version or
|
||||
(not packages.hasKey(name)):
|
||||
packages[name] = $version
|
||||
if isValidVersion(ver):
|
||||
let version = newVersion(ver)
|
||||
if packages.getOrDefault(name).newVersion < version or
|
||||
(not packages.hasKey(name)):
|
||||
packages[name] = $version
|
||||
else:
|
||||
localError(info, "invalid package name: " & p)
|
||||
|
||||
iterator chosen(packages: StringTableRef): string =
|
||||
for key, val in pairs(packages):
|
||||
@@ -109,7 +121,7 @@ proc addPathRec(dir: string, info: TLineInfo) =
|
||||
if dir[pos] in {DirSep, AltSep}: inc(pos)
|
||||
for k,p in os.walkDir(dir):
|
||||
if k == pcDir and p[pos] != '.':
|
||||
addPackage(packages, p)
|
||||
addPackage(packages, p, info)
|
||||
for p in packages.chosen:
|
||||
addNimblePath(p, info)
|
||||
|
||||
|
||||
@@ -123,7 +123,7 @@ proc fillBaseLexer(L: var TBaseLexer, pos: int): int =
|
||||
result = pos + 1 # nothing to do
|
||||
else:
|
||||
fillBuffer(L)
|
||||
L.offsetBase += pos
|
||||
L.offsetBase += pos + 1
|
||||
L.bufpos = 0
|
||||
result = 0
|
||||
L.lineStart = result
|
||||
|
||||
@@ -48,7 +48,6 @@ type # please make sure we have under 32 options
|
||||
optGenScript, # generate a script file to compile the *.c files
|
||||
optGenMapping, # generate a mapping file
|
||||
optRun, # run the compiled project
|
||||
optSymbolFiles, # use symbol files for speeding up compilation
|
||||
optCaasEnabled # compiler-as-a-service is running
|
||||
optSkipConfigFile, # skip the general config file
|
||||
optSkipProjConfigFile, # skip the project's config file
|
||||
@@ -141,16 +140,25 @@ var
|
||||
gEvalExpr* = "" # expression for idetools --eval
|
||||
gLastCmdTime*: float # when caas is enabled, we measure each command
|
||||
gListFullPaths*: bool
|
||||
isServing*: bool = false
|
||||
gPreciseStack*: bool = false
|
||||
gNoNimblePath* = false
|
||||
gExperimentalMode*: bool
|
||||
newDestructors*: bool
|
||||
gDynlibOverrideAll*: bool
|
||||
|
||||
type
|
||||
SymbolFilesOption* = enum
|
||||
disabledSf, enabledSf, writeOnlySf, readOnlySf
|
||||
|
||||
var gSymbolFiles*: SymbolFilesOption
|
||||
|
||||
proc importantComments*(): bool {.inline.} = gCmd in {cmdDoc, cmdIdeTools}
|
||||
proc usesNativeGC*(): bool {.inline.} = gSelectedGC >= gcRefc
|
||||
template preciseStack*(): bool = gPreciseStack
|
||||
|
||||
template compilationCachePresent*: untyped =
|
||||
{optCaasEnabled, optSymbolFiles} * gGlobalOptions != {}
|
||||
gSymbolFiles in {enabledSf, writeOnlySf}
|
||||
# {optCaasEnabled, optSymbolFiles} * gGlobalOptions != {}
|
||||
|
||||
template optPreserveOrigSource*: untyped =
|
||||
optEmbedOrigSrc in gGlobalOptions
|
||||
@@ -161,6 +169,7 @@ const
|
||||
RodExt* = "rod"
|
||||
HtmlExt* = "html"
|
||||
JsonExt* = "json"
|
||||
TagsExt* = "tags"
|
||||
TexExt* = "tex"
|
||||
IniExt* = "ini"
|
||||
DefaultConfig* = "nim.cfg"
|
||||
@@ -425,7 +434,7 @@ proc inclDynlibOverride*(lib: string) =
|
||||
gDllOverrides[lib.canonDynlibName] = "true"
|
||||
|
||||
proc isDynlibOverride*(lib: string): bool =
|
||||
result = gDllOverrides.hasKey(lib.canonDynlibName)
|
||||
result = gDynlibOverrideAll or gDllOverrides.hasKey(lib.canonDynlibName)
|
||||
|
||||
proc binaryStrSearch*(x: openArray[string], y: string): int =
|
||||
var a = 0
|
||||
|
||||
@@ -122,7 +122,7 @@ proc semNodeKindConstraints*(p: PNode): PNode =
|
||||
result.strVal = newStringOfCap(10)
|
||||
result.strVal.add(chr(aqNone.ord))
|
||||
if p.len >= 2:
|
||||
for i in 1.. <p.len:
|
||||
for i in 1..<p.len:
|
||||
compileConstraints(p.sons[i], result.strVal)
|
||||
if result.strVal.len > MaxStackSize-1:
|
||||
internalError(p.info, "parameter pattern too complex")
|
||||
@@ -152,7 +152,7 @@ proc checkForSideEffects*(n: PNode): TSideEffectAnalysis =
|
||||
# indirect call: assume side effect:
|
||||
return seSideEffect
|
||||
# we need to check n[0] too: (FwithSideEffectButReturnsProcWithout)(args)
|
||||
for i in 0 .. <n.len:
|
||||
for i in 0 ..< n.len:
|
||||
let ret = checkForSideEffects(n.sons[i])
|
||||
if ret == seSideEffect: return ret
|
||||
elif ret == seUnknown and result == seNoSideEffect:
|
||||
@@ -163,7 +163,7 @@ proc checkForSideEffects*(n: PNode): TSideEffectAnalysis =
|
||||
else:
|
||||
# assume no side effect:
|
||||
result = seNoSideEffect
|
||||
for i in 0 .. <n.len:
|
||||
for i in 0 ..< n.len:
|
||||
let ret = checkForSideEffects(n.sons[i])
|
||||
if ret == seSideEffect: return ret
|
||||
elif ret == seUnknown and result == seNoSideEffect:
|
||||
|
||||
@@ -72,6 +72,7 @@ proc parseStmtPragma(p: var TParser): PNode
|
||||
proc parsePragma(p: var TParser): PNode
|
||||
proc postExprBlocks(p: var TParser, x: PNode): PNode
|
||||
proc parseExprStmt(p: var TParser): PNode
|
||||
proc parseBlock(p: var TParser): PNode
|
||||
# implementation
|
||||
|
||||
proc getTok(p: var TParser) =
|
||||
@@ -785,21 +786,58 @@ proc parseIfExpr(p: var TParser, kind: TNodeKind): PNode =
|
||||
#| 'else' colcom expr
|
||||
#| ifExpr = 'if' condExpr
|
||||
#| whenExpr = 'when' condExpr
|
||||
result = newNodeP(kind, p)
|
||||
while true:
|
||||
getTok(p) # skip `if`, `elif`
|
||||
var branch = newNodeP(nkElifExpr, p)
|
||||
when true:
|
||||
result = newNodeP(kind, p)
|
||||
while true:
|
||||
getTok(p) # skip `if`, `when`, `elif`
|
||||
var branch = newNodeP(nkElifExpr, p)
|
||||
optInd(p, branch)
|
||||
addSon(branch, parseExpr(p))
|
||||
colcom(p, branch)
|
||||
addSon(branch, parseStmt(p))
|
||||
skipComment(p, branch)
|
||||
addSon(result, branch)
|
||||
if p.tok.tokType != tkElif: break # or not sameOrNoInd(p): break
|
||||
if p.tok.tokType == tkElse: # and sameOrNoInd(p):
|
||||
var branch = newNodeP(nkElseExpr, p)
|
||||
eat(p, tkElse)
|
||||
colcom(p, branch)
|
||||
addSon(branch, parseStmt(p))
|
||||
addSon(result, branch)
|
||||
else:
|
||||
var
|
||||
b: PNode
|
||||
wasIndented = false
|
||||
result = newNodeP(kind, p)
|
||||
|
||||
getTok(p)
|
||||
let branch = newNodeP(nkElifExpr, p)
|
||||
addSon(branch, parseExpr(p))
|
||||
colcom(p, branch)
|
||||
let oldInd = p.currInd
if realInd(p):
p.currInd = p.tok.indent
wasIndented = true
echo result.info, " yes ", p.currInd
addSon(branch, parseExpr(p))
optInd(p, branch)
addSon(result, branch)
if p.tok.tokType != tkElif: break
var branch = newNodeP(nkElseExpr, p)
eat(p, tkElse)
colcom(p, branch)
addSon(branch, parseExpr(p))
addSon(result, branch)
result.add branch
while sameInd(p) or not wasIndented:
case p.tok.tokType
of tkElif:
b = newNodeP(nkElifExpr, p)
getTok(p)
optInd(p, b)
addSon(b, parseExpr(p))
of tkElse:
b = newNodeP(nkElseExpr, p)
getTok(p)
else: break
colcom(p, b)
addSon(b, parseStmt(p))
addSon(result, b)
if b.kind == nkElseExpr: break
if wasIndented:
p.currInd = oldInd

proc parsePragma(p: var TParser): PNode =
#| pragma = '{.' optInd (exprColonExpr comma?)* optPar ('.}' | '}')
@@ -1041,12 +1079,14 @@ proc parseTypeDescKAux(p: var TParser, kind: TNodeKind,
parseSymbolList(p, list)

proc parseExpr(p: var TParser): PNode =
#| expr = (ifExpr
#| expr = (blockExpr
#| | ifExpr
#| | whenExpr
#| | caseExpr
#| | tryExpr)
#| / simpleExpr
case p.tok.tokType:
of tkBlock: result = parseBlock(p)
of tkIf: result = parseIfExpr(p, nkIfExpr)
of tkWhen: result = parseIfExpr(p, nkWhenExpr)
of tkCase: result = parseCase(p)
@@ -1100,13 +1140,9 @@ proc primary(p: var TParser, mode: TPrimaryMode): PNode =
else:
result = newNodeP(nkObjectTy, p)
getTok(p)
of tkGeneric, tkConcept:
of tkConcept:
if mode == pmTypeDef:
let wasGeneric = p.tok.tokType == tkGeneric
result = parseTypeClass(p)
# hack so that it's remembered and can be marked as deprecated in
# sem'check:
if wasGeneric: result.flags.incl nfBase2
else:
parMessage(p, errInvalidToken, p.tok)
of tkStatic:
@@ -1480,6 +1516,7 @@ proc parseFor(p: var TParser): PNode =

proc parseBlock(p: var TParser): PNode =
#| blockStmt = 'block' symbol? colcom stmt
#| blockExpr = 'block' symbol? colcom stmt
result = newNodeP(nkBlockStmt, p)
getTokNoInd(p)
if p.tok.tokType == tkColon: addSon(result, ast.emptyNode)
@@ -1905,7 +1942,7 @@ proc parseVariable(p: var TParser): PNode =
#| variable = (varTuple / identColonEquals) colonBody? indAndComment
if p.tok.tokType == tkParLe: result = parseVarTuple(p)
else: result = parseIdentColonEquals(p, {withPragma, withDot})
result{-1} = postExprBlocks(p, result{-1})
result[^1] = postExprBlocks(p, result[^1])
indAndComment(p, result)

proc parseBind(p: var TParser, k: TNodeKind): PNode =
@@ -2036,8 +2073,13 @@ proc parseStmt(p: var TParser): PNode =
if a.kind != nkEmpty:
addSon(result, a)
else:
parMessage(p, errExprExpected, p.tok)
getTok(p)
# This is done to make the new 'if' expressions work better.
# XXX Eventually we need to be more strict here.
if p.tok.tokType notin {tkElse, tkElif}:
parMessage(p, errExprExpected, p.tok)
getTok(p)
else:
break
if not p.hasProgress and p.tok.tokType == tkEof: break
else:
# the case statement is only needed for better error messages:

@@ -15,9 +15,10 @@ import
condsyms, idents, renderer, types, extccomp, math, magicsys, nversion,
nimsets, syntaxes, times, rodread, idgen, modulegraphs, reorder


type
TPassContext* = object of RootObj # the pass's context
fromCache*: bool # true if created by "openCached"
rd*: PRodReader # != nil if created by "openCached"

PPassContext* = ref TPassContext

@@ -117,7 +118,7 @@ proc openPassesCached(g: ModuleGraph; a: var TPassContextArray, module: PSym,
if not isNil(gPasses[i].openCached):
a[i] = gPasses[i].openCached(g, module, rd)
if a[i] != nil:
a[i].fromCache = true
a[i].rd = rd
else:
a[i] = nil

@@ -211,7 +212,7 @@ proc processModule*(graph: ModuleGraph; module: PSym, stream: PLLStream,
if n.kind == nkEmpty: break
sl.add n
if sfReorder in module.flags:
sl = reorder sl
sl = reorder(graph, sl, module, cache)
discard processTopLevelStmt(sl, a)
break
elif not processTopLevelStmt(n, a): break

@@ -63,7 +63,7 @@ proc sameTrees(a, b: PNode): bool =

proc inSymChoice(sc, x: PNode): bool =
if sc.kind == nkClosedSymChoice:
for i in 0.. <sc.len:
for i in 0..<sc.len:
if sc.sons[i].sym == x.sym: return true
elif sc.kind == nkOpenSymChoice:
# same name suffices for open sym choices!
@@ -83,7 +83,7 @@ proc isPatternParam(c: PPatternContext, p: PNode): bool {.inline.} =
result = p.kind == nkSym and p.sym.kind == skParam and p.sym.owner == c.owner

proc matchChoice(c: PPatternContext, p, n: PNode): bool =
for i in 1 .. <p.len:
for i in 1 ..< p.len:
if matches(c, p.sons[i], n): return true

proc bindOrCheck(c: PPatternContext, param: PSym, n: PNode): bool =
@@ -115,7 +115,7 @@ proc matchNested(c: PPatternContext, p, n: PNode, rpn: bool): bool =
if rpn: arglist.add(n.sons[0])
elif n.kind == nkHiddenStdConv and n.sons[1].kind == nkBracket:
let n = n.sons[1]
for i in 0.. <n.len:
for i in 0..<n.len:
if not matchStarAux(c, op, n[i], arglist, rpn): return false
elif checkTypes(c, p.sons[2].sym, n):
add(arglist, n)
@@ -186,7 +186,7 @@ proc matches(c: PPatternContext, p, n: PNode): bool =
# unpack varargs:
let n = lastSon(n).sons[1]
arglist = newNodeI(nkArgList, n.info, n.len)
for i in 0.. <n.len: arglist.sons[i] = n.sons[i]
for i in 0..<n.len: arglist.sons[i] = n.sons[i]
else:
arglist = newNodeI(nkArgList, n.info, sonsLen(n) - plen + 1)
# f(1, 2, 3)
@@ -206,7 +206,7 @@ proc matches(c: PPatternContext, p, n: PNode): bool =

proc matchStmtList(c: PPatternContext, p, n: PNode): PNode =
proc matchRange(c: PPatternContext, p, n: PNode, i: int): bool =
for j in 0 .. <p.len:
for j in 0 ..< p.len:
if not matches(c, p.sons[j], n.sons[i+j]):
# we need to undo any bindings:
if not isNil(c.mapping): c.mapping = nil
@@ -229,7 +229,7 @@ proc matchStmtList(c: PPatternContext, p, n: PNode): PNode =

proc aliasAnalysisRequested(params: PNode): bool =
if params.len >= 2:
for i in 1 .. < params.len:
for i in 1 ..< params.len:
let param = params.sons[i].sym
if whichAlias(param) != aqNone: return true

@@ -237,7 +237,7 @@ proc addToArgList(result, n: PNode) =
if n.typ != nil and n.typ.kind != tyStmt:
if n.kind != nkArgList: result.add(n)
else:
for i in 0 .. <n.len: result.add(n.sons[i])
for i in 0 ..< n.len: result.add(n.sons[i])

proc applyRule*(c: PContext, s: PSym, n: PNode): PNode =
## returns a tree to semcheck if the rule triggered; nil otherwise
@@ -256,7 +256,7 @@ proc applyRule*(c: PContext, s: PSym, n: PNode): PNode =
var args: PNode
if requiresAA:
args = newNodeI(nkArgList, n.info)
for i in 1 .. < params.len:
for i in 1 ..< params.len:
let param = params.sons[i].sym
let x = getLazy(ctx, param)
# couldn't bind parameter:
@@ -265,7 +265,7 @@ proc applyRule*(c: PContext, s: PSym, n: PNode): PNode =
if requiresAA: addToArgList(args, x)
# perform alias analysis here:
if requiresAA:
for i in 1 .. < params.len:
for i in 1 ..< params.len:
var rs = result.sons[i]
let param = params.sons[i].sym
case whichAlias(param)

@@ -7,8 +7,9 @@
# distribution, for details about the copyright.
#

## Plugin support for the Nim compiler. Right now they
## need to be build with the compiler, no DLL support.
## Plugin support for the Nim compiler. Right now plugins
## need to be built with the compiler only: plugins using
## DLLs or the FFI will not work.

import ast, semdata, idents


@@ -21,17 +21,17 @@ const
const
procPragmas* = {FirstCallConv..LastCallConv, wImportc, wExportc, wNodecl,
wMagic, wNosideeffect, wSideeffect, wNoreturn, wDynlib, wHeader,
wCompilerproc, wProcVar, wDeprecated, wVarargs, wCompileTime, wMerge,
wCompilerProc, wCore, wProcVar, wDeprecated, wVarargs, wCompileTime, wMerge,
wBorrow, wExtern, wImportCompilerProc, wThread, wImportCpp, wImportObjC,
wAsmNoStackFrame, wError, wDiscardable, wNoInit, wDestructor, wCodegenDecl,
wAsmNoStackFrame, wError, wDiscardable, wNoInit, wCodegenDecl,
wGensym, wInject, wRaises, wTags, wLocks, wDelegator, wGcSafe,
wOverride, wConstructor, wExportNims, wUsed}
wOverride, wConstructor, wExportNims, wUsed, wLiftLocals}
converterPragmas* = procPragmas
methodPragmas* = procPragmas+{wBase}-{wImportCpp}
templatePragmas* = {wImmediate, wDeprecated, wError, wGensym, wInject, wDirty,
wDelegator, wExportNims, wUsed}
macroPragmas* = {FirstCallConv..LastCallConv, wImmediate, wImportc, wExportc,
wNodecl, wMagic, wNosideeffect, wCompilerproc, wDeprecated, wExtern,
wNodecl, wMagic, wNosideeffect, wCompilerProc, wCore, wDeprecated, wExtern,
wImportCpp, wImportObjC, wError, wDiscardable, wGensym, wInject, wDelegator,
wExportNims, wUsed}
iteratorPragmas* = {FirstCallConv..LastCallConv, wNosideeffect, wSideeffect,
@@ -52,14 +52,14 @@ const
wDeprecated, wExtern, wThread, wImportCpp, wImportObjC, wAsmNoStackFrame,
wRaises, wLocks, wTags, wGcSafe}
typePragmas* = {wImportc, wExportc, wDeprecated, wMagic, wAcyclic, wNodecl,
wPure, wHeader, wCompilerproc, wFinal, wSize, wExtern, wShallow,
wPure, wHeader, wCompilerProc, wCore, wFinal, wSize, wExtern, wShallow,
wImportCpp, wImportObjC, wError, wIncompleteStruct, wByCopy, wByRef,
wInheritable, wGensym, wInject, wRequiresInit, wUnchecked, wUnion, wPacked,
wBorrow, wGcSafe, wExportNims, wPartial, wUsed, wExplain}
wBorrow, wGcSafe, wExportNims, wPartial, wUsed, wExplain, wPackage}
fieldPragmas* = {wImportc, wExportc, wDeprecated, wExtern,
wImportCpp, wImportObjC, wError, wGuard, wBitsize, wUsed}
varPragmas* = {wImportc, wExportc, wVolatile, wRegister, wThreadVar, wNodecl,
wMagic, wHeader, wDeprecated, wCompilerproc, wDynlib, wExtern,
wMagic, wHeader, wDeprecated, wCompilerProc, wCore, wDynlib, wExtern,
wImportCpp, wImportObjC, wError, wNoInit, wCompileTime, wGlobal,
wGensym, wInject, wCodegenDecl, wGuard, wGoto, wExportNims, wUsed}
constPragmas* = {wImportc, wExportc, wHeader, wDeprecated, wMagic, wNodecl,
@@ -70,6 +70,14 @@ const
wThread, wRaises, wLocks, wTags, wGcSafe}
allRoutinePragmas* = methodPragmas + iteratorPragmas + lambdaPragmas

proc getPragmaVal*(procAst: PNode; name: TSpecialWord): PNode =
let p = procAst[pragmasPos]
if p.kind == nkEmpty: return nil
for it in p:
if it.kind == nkExprColonExpr and it[0].kind == nkIdent and
it[0].ident.id == ord(name):
return it[1]

proc pragma*(c: PContext, sym: PSym, n: PNode, validPragmas: TSpecialWords)
# implementation

@@ -575,7 +583,7 @@ proc pragmaLockStmt(c: PContext; it: PNode) =
if n.kind != nkBracket:
localError(n.info, errGenerated, "locks pragma takes a list of expressions")
else:
for i in 0 .. <n.len:
for i in 0 ..< n.len:
n.sons[i] = c.semExpr(c, n.sons[i])

proc pragmaLocks(c: PContext, it: PNode): TLockLevel =
@@ -618,7 +626,8 @@ proc deprecatedStmt(c: PContext; pragma: PNode) =
for n in pragma:
if n.kind in {nkExprColonExpr, nkExprEqExpr}:
let dest = qualifiedLookUp(c, n[1], {checkUndeclared})
assert dest != nil
if dest == nil or dest.kind in routineKinds:
localError(n.info, warnUser, "the .deprecated pragma is unreliable for routines")
let src = considerQuotedIdent(n[0])
let alias = newSym(skAlias, src, dest, n[0].info)
incl(alias.flags, sfExported)
@@ -750,10 +759,6 @@ proc singlePragma(c: PContext, sym: PSym, n: PNode, i: int,
incl(sym.loc.flags, lfNoDecl)
# implies nodecl, because otherwise header would not make sense
if sym.loc.r == nil: sym.loc.r = rope(sym.name.s)
of wDestructor:
sym.flags.incl sfOverriden
if sym.name.s.normalize != "destroy":
localError(n.info, errGenerated, "destructor has to be named 'destroy'")
of wOverride:
sym.flags.incl sfOverriden
of wNosideeffect:
@@ -766,9 +771,11 @@ proc singlePragma(c: PContext, sym: PSym, n: PNode, i: int,
of wNoreturn:
noVal(it)
incl(sym.flags, sfNoReturn)
if sym.typ[0] != nil:
localError(sym.ast[paramsPos][0].info, errNoReturnWithReturnTypeNotAllowed)
of wDynlib:
processDynLib(c, it, sym)
of wCompilerproc:
of wCompilerProc, wCore:
noVal(it) # compilerproc may not get a string!
cppDefine(c.graph.config, sym.name.s)
if sfFromGeneric notin sym.flags: markCompilerProc(sym)
@@ -799,6 +806,10 @@ proc singlePragma(c: PContext, sym: PSym, n: PNode, i: int,
noVal(it)
if sym.typ == nil or tfFinal in sym.typ.flags: invalidPragma(it)
else: incl(sym.typ.flags, tfInheritable)
of wPackage:
noVal(it)
if sym.typ == nil: invalidPragma(it)
else: incl(sym.flags, sfForward)
of wAcyclic:
noVal(it)
if sym.typ == nil: invalidPragma(it)
@@ -974,6 +985,7 @@ proc singlePragma(c: PContext, sym: PSym, n: PNode, i: int,
noVal(it)
if sym == nil: invalidPragma(it)
else: sym.flags.incl sfUsed
of wLiftLocals: discard
else: invalidPragma(it)
else: invalidPragma(it)


@@ -38,6 +38,7 @@ type
checkAnon: bool # we're in a context that can contain sfAnon
inPragma: int
when defined(nimpretty):
pendingNewlineCount: int
origContent: string


@@ -62,9 +63,31 @@ proc renderDefinitionName*(s: PSym, noQuotes = false): string =
else:
result = '`' & x & '`'

when not defined(nimpretty):
const
IndentWidth = 2
longIndentWid = IndentWidth * 2
else:
template IndentWidth: untyped = lexer.gIndentationWidth
template longIndentWid: untyped = IndentWidth() * 2

proc minmaxLine(n: PNode): (int, int) =
case n.kind
of nkTripleStrLit:
result = (n.info.line.int, n.info.line.int + countLines(n.strVal))
of nkCommentStmt:
result = (n.info.line.int, n.info.line.int + countLines(n.comment))
else:
result = (n.info.line.int, n.info.line.int)
for i in 0 ..< safeLen(n):
let (currMin, currMax) = minmaxLine(n[i])
if currMin < result[0]: result[0] = currMin
if currMax > result[1]: result[1] = currMax

proc lineDiff(a, b: PNode): int =
result = minmaxLine(b)[0] - minmaxLine(a)[1]

const
IndentWidth = 2
longIndentWid = 4
MaxLineLen = 80
LineCommentColumn = 30

@@ -90,7 +113,11 @@ proc addTok(g: var TSrcGen, kind: TTokType, s: string) =

proc addPendingNL(g: var TSrcGen) =
if g.pendingNL >= 0:
addTok(g, tkSpaces, "\n" & spaces(g.pendingNL))
when defined(nimpretty):
let newlines = repeat("\n", clamp(g.pendingNewlineCount, 1, 3))
else:
const newlines = "\n"
addTok(g, tkSpaces, newlines & spaces(g.pendingNL))
g.lineLen = g.pendingNL
g.pendingNL = - 1
g.pendingWhitespace = -1
@@ -114,11 +141,17 @@ proc putNL(g: var TSrcGen) =

proc optNL(g: var TSrcGen, indent: int) =
g.pendingNL = indent
g.lineLen = indent # BUGFIX
g.lineLen = indent
when defined(nimpretty): g.pendingNewlineCount = 0

proc optNL(g: var TSrcGen) =
optNL(g, g.indent)

proc optNL(g: var TSrcGen; a, b: PNode) =
g.pendingNL = g.indent
g.lineLen = g.indent
when defined(nimpretty): g.pendingNewlineCount = lineDiff(a, b)

proc indentNL(g: var TSrcGen) =
inc(g.indent, IndentWidth)
g.pendingNL = g.indent
@@ -142,8 +175,17 @@ proc put(g: var TSrcGen, kind: TTokType, s: string) =

proc toNimChar(c: char): string =
case c
of '\0': result = "\\0"
of '\x01'..'\x1F', '\x80'..'\xFF': result = "\\x" & strutils.toHex(ord(c), 2)
of '\0': result = "\\x00" # not "\\0" to avoid ambiguous cases like "\\012".
of '\a': result = "\\a" # \x07
of '\b': result = "\\b" # \x08
of '\t': result = "\\t" # \x09
of '\L': result = "\\L" # \x0A
of '\v': result = "\\v" # \x0B
of '\f': result = "\\f" # \x0C
of '\c': result = "\\c" # \x0D
of '\e': result = "\\e" # \x1B
of '\x01'..'\x06', '\x0E'..'\x1A', '\x1C'..'\x1F', '\x80'..'\xFF':
result = "\\x" & strutils.toHex(ord(c), 2)
of '\'', '\"', '\\': result = '\\' & c
else: result = c & ""

@@ -306,10 +348,14 @@ proc ulitAux(g: TSrcGen; n: PNode, x: BiggestInt, size: int): string =

proc atom(g: TSrcGen; n: PNode): string =
when defined(nimpretty):
let comment = if n.info.commentOffsetA < n.info.commentOffsetB:
" " & substr(g.origContent, n.info.commentOffsetA, n.info.commentOffsetB)
else:
""
if n.info.offsetA <= n.info.offsetB:
# for some constructed tokens this can not be the case and we're better
# off to not mess with the offset then.
return substr(g.origContent, n.info.offsetA, n.info.offsetB)
return substr(g.origContent, n.info.offsetA, n.info.offsetB) & comment
var f: float32
case n.kind
of nkEmpty: result = ""
@@ -577,12 +623,16 @@ proc gstmts(g: var TSrcGen, n: PNode, c: TContext, doIndent=true) =
if n.kind == nkEmpty: return
if n.kind in {nkStmtList, nkStmtListExpr, nkStmtListType}:
if doIndent: indentNL(g)
for i in countup(0, sonsLen(n) - 1):
optNL(g)
if n.sons[i].kind in {nkStmtList, nkStmtListExpr, nkStmtListType}:
gstmts(g, n.sons[i], c, doIndent=false)
let L = n.len
for i in 0 .. L-1:
if i > 0:
optNL(g, n[i-1], n[i])
else:
gsub(g, n.sons[i])
optNL(g)
if n[i].kind in {nkStmtList, nkStmtListExpr, nkStmtListType}:
gstmts(g, n[i], c, doIndent=false)
else:
gsub(g, n[i])
gcoms(g)
if doIndent: dedent(g)
else:
@@ -669,6 +719,7 @@ proc gcase(g: var TSrcGen, n: PNode) =
var c: TContext
initContext(c)
var length = sonsLen(n)
if length == 0: return
var last = if n.sons[length-1].kind == nkElse: -2 else: -1
if longMode(g, n, 0, last): incl(c.flags, rfLongMode)
putWithSpace(g, tkCase, "case")
@@ -785,7 +836,10 @@ proc gident(g: var TSrcGen, n: PNode) =
t = tkOpr
put(g, t, s)
if n.kind == nkSym and (renderIds in g.flags or sfGenSym in n.sym.flags):
put(g, tkIntLit, $n.sym.id)
when defined(debugMagics):
put(g, tkIntLit, $n.sym.id & $n.sym.magic)
else:
put(g, tkIntLit, $n.sym.id)

proc doParamsAux(g: var TSrcGen, params: PNode) =
if params.len > 1:
@@ -816,7 +870,7 @@ proc gsub(g: var TSrcGen, n: PNode, c: TContext) =
a: TContext
if n.comment != nil: pushCom(g, n)
case n.kind # atoms:
of nkTripleStrLit: putRawStr(g, tkTripleStrLit, n.strVal)
of nkTripleStrLit: put(g, tkTripleStrLit, atom(g, n))
of nkEmpty: discard
of nkType: put(g, tkInvalid, atom(g, n))
of nkSym, nkIdent: gident(g, n)
@@ -1035,7 +1089,7 @@ proc gsub(g: var TSrcGen, n: PNode, c: TContext) =
of nkAccQuoted:
put(g, tkAccent, "`")
if n.len > 0: gsub(g, n.sons[0])
for i in 1 .. <n.len:
for i in 1 ..< n.len:
put(g, tkSpaces, Space)
gsub(g, n.sons[i])
put(g, tkAccent, "`")
@@ -1353,8 +1407,8 @@ proc gsub(g: var TSrcGen, n: PNode, c: TContext) =
put(g, tkBracketRi, "]")
of nkTupleClassTy:
put(g, tkTuple, "tuple")
of nkMetaNode_Obsolete:
put(g, tkParLe, "(META|")
of nkComesFrom:
put(g, tkParLe, "(ComesFrom|")
gsub(g, n, 0)
put(g, tkParRi, ")")
of nkGotoState, nkState:
@@ -1384,7 +1438,7 @@ proc renderTree*(n: PNode, renderFlags: TRenderFlags = {}): string =

proc `$`*(n: PNode): string = n.renderTree

proc renderModule*(n: PNode, filename: string,
proc renderModule*(n: PNode, infile, outfile: string,
renderFlags: TRenderFlags = {}) =
var
f: File
@@ -1392,9 +1446,9 @@ proc renderModule*(n: PNode, filename: string,
initSrcGen(g, renderFlags)
when defined(nimpretty):
try:
g.origContent = readFile(filename)
g.origContent = readFile(infile)
except IOError:
rawMessage(errCannotOpenFile, filename)
rawMessage(errCannotOpenFile, infile)

for i in countup(0, sonsLen(n) - 1):
gsub(g, n.sons[i])
@@ -1406,11 +1460,11 @@ proc renderModule*(n: PNode, filename: string,
gcoms(g)
if optStdout in gGlobalOptions:
write(stdout, g.buf)
elif open(f, filename, fmWrite):
elif open(f, outfile, fmWrite):
write(f, g.buf)
close(f)
else:
rawMessage(errCannotOpenFile, filename)
rawMessage(errCannotOpenFile, outfile)

proc initTokRender*(r: var TSrcGen, n: PNode, renderFlags: TRenderFlags = {}) =
initSrcGen(r, renderFlags)

@@ -1,13 +1,40 @@

import intsets, tables, ast, idents, renderer
import
intsets, ast, idents, algorithm, renderer, parser, ospaths, strutils,
sequtils, msgs, modulegraphs, syntaxes, options, modulepaths, tables

const
nfTempMark = nfTransf
nfPermMark = nfNoRewrite
type
DepN = ref object
pnode: PNode
id, idx, lowLink: int
onStack: bool
kids: seq[DepN]
hAQ, hIS, hB, hCmd: int
when not defined(release):
expls: seq[string]
DepG = seq[DepN]

when not defined(release):
var idNames = newTable[int, string]()

proc newDepN(id: int, pnode: PNode): DepN =
new(result)
result.id = id
result.pnode = pnode
result.idx = -1
result.lowLink = -1
result.onStack = false
result.kids = @[]
result.hAQ = -1
result.hIS = -1
result.hB = -1
result.hCmd = -1
when not defined(release):
result.expls = @[]

proc accQuoted(n: PNode): PIdent =
var id = ""
for i in 0 .. <n.len:
for i in 0 ..< n.len:
let x = n[i]
case x.kind
of nkIdent: id.add(x.ident.s)
@@ -21,10 +48,19 @@ proc addDecl(n: PNode; declares: var IntSet) =
of nkPragmaExpr: addDecl(n[0], declares)
of nkIdent:
declares.incl n.ident.id
when not defined(release):
idNames[n.ident.id] = n.ident.s
of nkSym:
declares.incl n.sym.name.id
when not defined(release):
idNames[n.sym.name.id] = n.sym.name.s
of nkAccQuoted:
declares.incl accQuoted(n).id
let a = accQuoted(n)
declares.incl a.id
when not defined(release):
idNames[a.id] = a.s
of nkEnumFieldDef:
addDecl(n[0], declares)
else: discard

proc computeDeps(n: PNode, declares, uses: var IntSet; topLevel: bool) =
@@ -32,7 +68,7 @@ proc computeDeps(n: PNode, declares, uses: var IntSet; topLevel: bool) =
template decl(n) =
if topLevel: addDecl(n, declares)
case n.kind
of procDefs:
of procDefs, nkMacroDef, nkTemplateDef:
decl(n[0])
for i in 1..bodyPos: deps(n[i])
of nkLetSection, nkVarSection, nkUsingStmt:
@@ -44,43 +80,358 @@ proc computeDeps(n: PNode, declares, uses: var IntSet; topLevel: bool) =
for a in n:
if a.len >= 3:
decl(a[0])
for i in 1..<a.len: deps(a[i])
for i in 1..<a.len:
if a[i].kind == nkEnumTy:
# declare enum members
for b in a[i]:
decl(b)
else:
deps(a[i])
of nkIdentDefs:
for i in 1..<n.len: # avoid members identifiers in object definition
deps(n[i])
of nkIdent: uses.incl n.ident.id
of nkSym: uses.incl n.sym.name.id
of nkAccQuoted: uses.incl accQuoted(n).id
of nkOpenSymChoice, nkClosedSymChoice:
uses.incl n.sons[0].sym.name.id
of nkStmtList, nkStmtListExpr, nkWhenStmt, nkElifBranch, nkElse:
of nkStmtList, nkStmtListExpr, nkWhenStmt, nkElifBranch, nkElse, nkStaticStmt:
for i in 0..<len(n): computeDeps(n[i], declares, uses, topLevel)
of nkPragma:
let a = n.sons[0]
if a.kind == nkExprColonExpr and a.sons[0].kind == nkIdent and
a.sons[0].ident.s == "pragma":
# user defined pragma
decl(a.sons[1])
else:
for i in 0..<safeLen(n): deps(n[i])
else:
for i in 0..<safeLen(n): deps(n[i])

proc visit(i: int; all, res: PNode; deps: var seq[(IntSet, IntSet)]): bool =
let n = all[i]
if nfTempMark in n.flags:
# not a DAG!
proc cleanPath(s: string): string =
# Here paths may have the form A / B or "A/B"
result = ""
for c in s:
if c != ' ' and c != '\"':
result.add c

proc joinPath(parts: seq[string]): string =
let nb = parts.len
assert nb > 0
if nb == 1:
return parts[0]
result = parts[0] / parts[1]
for i in 2..<parts.len:
result = result / parts[i]

proc getIncludePath(n: PNode, modulePath: string): string =
let istr = n.renderTree.cleanPath
let (pdir, _) = modulePath.splitPath
let p = istr.split('/').joinPath.addFileExt("nim")
result = pdir / p

proc hasIncludes(n:PNode): bool =
for a in n:
if a.kind == nkIncludeStmt:
return true

proc includeModule*(graph: ModuleGraph; s: PSym, fileIdx: int32;
cache: IdentCache): PNode {.procvar.} =
result = syntaxes.parseFile(fileIdx, cache)
graph.addDep(s, fileIdx)
graph.addIncludeDep(s.position.int32, fileIdx)

proc expandIncludes(graph: ModuleGraph, module: PSym, n: PNode,
modulePath: string, includedFiles: var IntSet,
cache: IdentCache): PNode =
# Parses includes and injects them in the current tree
if not n.hasIncludes:
return n
result = newNodeI(nkStmtList, n.info)
for a in n:
if a.kind == nkIncludeStmt:
for i in 0..<a.len:
var f = checkModuleName(a.sons[i])
if f != InvalidFileIDX:
if containsOrIncl(includedFiles, f):
localError(a.info, errRecursiveDependencyX, f.toFilename)
else:
let nn = includeModule(graph, module, f, cache)
let nnn = expandIncludes(graph, module, nn, modulePath,
includedFiles, cache)
excl(includedFiles, f)
for b in nnn:
result.add b
else:
result.add a

proc splitSections(n: PNode): PNode =
# Split typeSections and ConstSections into
# sections that contain only one definition
assert n.kind == nkStmtList
result = newNodeI(nkStmtList, n.info)
for a in n:
if a.kind in {nkTypeSection, nkConstSection} and a.len > 1:
for b in a:
var s = newNode(a.kind)
s.info = b.info
s.add b
result.add s
else:
result.add a

proc haveSameKind(dns: seq[DepN]): bool =
# Check if all the nodes in a strongly connected
# component have the same kind
result = true
let kind = dns[0].pnode.kind
for dn in dns:
if dn.pnode.kind != kind:
return false

proc mergeSections(comps: seq[seq[DepN]], res: PNode) =
# Merges typeSections and ConstSections when they form
# a strong component (ex: circular type definition)
for c in comps:
assert c.len > 0
if c.len == 1:
res.add c[0].pnode
else:
let fstn = c[0].pnode
let kind = fstn.kind
# always return to the original order when we got circular dependencies
let cs = c.sortedByIt(it.id)
if kind in {nkTypeSection, nkConstSection} and haveSameKind(cs):
# Circular dependency between type or const sections, we just
# need to merge them
var sn = newNode(kind)
for dn in cs:
sn.add dn.pnode.sons[0]
res.add sn
else:
# Problematic circular dependency, we arrange the nodes into
# their original relative order and make sure to re-merge
# consecutive type and const sections
var wmsg = "Circular dependency detected. reorder pragma may not be able to" &
" reorder some nodes properely"
when not defined(release):
wmsg &= ":\n"
for i in 0..<cs.len-1:
for j in i..<cs.len:
for ci in 0..<cs[i].kids.len:
if cs[i].kids[ci].id == cs[j].id:
wmsg &= "line " & $cs[i].pnode.info.line &
" depends on line " & $cs[j].pnode.info.line &
": " & cs[i].expls[ci] & "\n"
for j in 0..<cs.len-1:
for ci in 0..<cs[^1].kids.len:
if cs[^1].kids[ci].id == cs[j].id:
wmsg &= "line " & $cs[^1].pnode.info.line &
" depends on line " & $cs[j].pnode.info.line &
": " & cs[^1].expls[ci] & "\n"
message(cs[0].pnode.info, warnUser, wmsg)

var i = 0
while i < cs.len:
if cs[i].pnode.kind in {nkTypeSection, nkConstSection}:
let ckind = cs[i].pnode.kind
var sn = newNode(ckind)
sn.add cs[i].pnode[0]
inc i
while i < cs.len and cs[i].pnode.kind == ckind :
sn.add cs[i].pnode[0]
inc i
res.add sn
else:
res.add cs[i].pnode
inc i

proc hasImportStmt(n: PNode): bool =
# Checks if the node is an import statement or
# if it contains one
case n.kind
of nkImportStmt, nkFromStmt, nkImportExceptStmt:
return true
if nfPermMark notin n.flags:
incl n.flags, nfTempMark
var uses = deps[i][1]
for j in 0..<all.len:
if j != i:
let declares = deps[j][0]
of nkStmtList, nkStmtListExpr, nkWhenStmt, nkElifBranch, nkElse, nkStaticStmt:
for a in n:
if a.hasImportStmt:
return true
else:
result = false

proc hasImportStmt(n: DepN): bool =
if n.hIS < 0:
n.hIS = ord(n.pnode.hasImportStmt)
result = bool(n.hIS)

proc hasCommand(n: PNode): bool =
# Checks if the node is a command or a call
# or if it contains one
case n.kind
of nkCommand, nkCall:
result = true
of nkStmtList, nkStmtListExpr, nkWhenStmt, nkElifBranch, nkElse,
nkStaticStmt, nkLetSection, nkConstSection, nkVarSection,
nkIdentDefs:
for a in n:
if a.hasCommand:
return true
else:
return false

proc hasCommand(n: DepN): bool =
if n.hCmd < 0:
n.hCmd = ord(n.pnode.hasCommand)
result = bool(n.hCmd)

proc hasAccQuoted(n: PNode): bool =
if n.kind == nkAccQuoted:
return true
for a in n:
if hasAccQuoted(a):
return true

const extandedProcDefs = procDefs + {nkMacroDef, nkTemplateDef}

proc hasAccQuotedDef(n: PNode): bool =
# Checks if the node is a function, macro, template ...
# with a quoted name or if it contains one
case n.kind
of extandedProcDefs:
result = n[0].hasAccQuoted
of nkStmtList, nkStmtListExpr, nkWhenStmt, nkElifBranch, nkElse, nkStaticStmt:
for a in n:
if a.hasAccQuotedDef:
return true
else:
result = false

proc hasAccQuotedDef(n: DepN): bool =
if n.hAQ < 0:
n.hAQ = ord(n.pnode.hasAccQuotedDef)
result = bool(n.hAQ)

proc hasBody(n: PNode): bool =
# Checks if the node is a function, macro, template ...
# with a body or if it contains one
case n.kind
of nkCommand, nkCall:
result = true
of extandedProcDefs:
result = n[^1].kind == nkStmtList
of nkStmtList, nkStmtListExpr, nkWhenStmt, nkElifBranch, nkElse, nkStaticStmt:
for a in n:
if a.hasBody:
return true
else:
result = false

proc hasBody(n: DepN): bool =
if n.hB < 0:
n.hB = ord(n.pnode.hasBody)
result = bool(n.hB)

proc intersects(s1, s2: IntSet): bool =
for a in s1:
if s2.contains(a):
return true

proc buildGraph(n: PNode, deps: seq[(IntSet, IntSet)]): DepG =
# Build a dependency graph
result = newSeqOfCap[DepN](deps.len)
for i in 0..<deps.len:
result.add newDepN(i, n.sons[i])
for i in 0..<deps.len:
var ni = result[i]
let uses = deps[i][1]
let niHasBody = ni.hasBody
let niHasCmd = ni.hasCommand
for j in 0..<deps.len:
if i == j: continue
var nj = result[j]
let declares = deps[j][0]
if j < i and nj.hasCommand and niHasCmd:
# Preserve order for commands and calls
ni.kids.add nj
when not defined(release):
ni.expls.add "both have commands and one comes after the other"
elif j < i and nj.hasImportStmt:
# Every node that comes after an import statement must
# depend on that import
ni.kids.add nj
when not defined(release):
ni.expls.add "parent is, or contains, an import statement and child comes after it"
elif j < i and niHasBody and nj.hasAccQuotedDef:
# Every function, macro, template... with a body depends
# on precedent function declarations that have quoted names.
# That's because it is hard to detect the use of functions
# like "[]=", "[]", "or" ... in their bodies.
ni.kids.add nj
when not defined(release):
ni.expls.add "one declares a quoted identifier and the other has a body and comes after it"
elif j < i and niHasBody and not nj.hasBody and
intersects(deps[i][0], declares):
# Keep function declaration before function definition
ni.kids.add nj
when not defined(release):
for dep in deps[i][0]:
if dep in declares:
ni.expls.add "one declares \"" & idNames[dep] & "\" and the other defines it"
else:
for d in declares:
if uses.contains(d):
let oldLen = res.len
if visit(j, all, res, deps):
result = true
# rollback what we did, it turned out to be a dependency that caused
# trouble:
for k in oldLen..<res.len:
res.sons[k].flags = res.sons[k].flags - {nfPermMark, nfTempMark}
if oldLen != res.len: res.sons.setLen oldLen
break
n.flags = n.flags + {nfPermMark} - {nfTempMark}
res.add n
ni.kids.add nj
when not defined(release):
ni.expls.add "one declares \"" & idNames[d] & "\" and the other uses it"

proc reorder*(n: PNode): PNode =
proc strongConnect(v: var DepN, idx: var int, s: var seq[DepN],
res: var seq[seq[DepN]]) =
# Recursive part of Tarjan's algorithm
v.idx = idx
v.lowLink = idx
inc idx
s.add v
v.onStack = true
for w in v.kids.mitems:
if w.idx < 0:
strongConnect(w, idx, s, res)
v.lowLink = min(v.lowLink, w.lowLink)
elif w.onStack:
v.lowLink = min(v.lowLink, w.idx)
if v.lowLink == v.idx:
var comp = newSeq[DepN]()
while true:
var w = s.pop
w.onStack = false
comp.add w
if w.id == v.id: break
res.add comp

proc getStrongComponents(g: var DepG): seq[seq[DepN]] =
## Tarjan's algorithm. Performs a topological sort
## and detects strongly connected components.
result = newSeq[seq[DepN]]()
var s = newSeq[DepN]()
var idx = 0
for v in g.mitems:
if v.idx < 0:
strongConnect(v, idx, s, result)

proc hasForbiddenPragma(n: PNode): bool =
# Checks if the tree node has some pragmas that do not
# play well with reordering, like the push/pop pragma
for a in n:
if a.kind == nkPragma and a[0].kind == nkIdent and
a[0].ident.s == "push":
return true

proc reorder*(graph: ModuleGraph, n: PNode, module: PSym, cache: IdentCache): PNode =
if n.hasForbiddenPragma:
return n
var includedFiles = initIntSet()
let mpath = module.fileIdx.toFullPath
let n = expandIncludes(graph, module, n, mpath,
includedFiles, cache).splitSections
result = newNodeI(nkStmtList, n.info)
var deps = newSeq[(IntSet, IntSet)](n.len)
for i in 0..<n.len:
@@ -88,15 +439,6 @@ proc reorder*(n: PNode): PNode =
deps[i][1] = initIntSet()
computeDeps(n[i], deps[i][0], deps[i][1], true)

for i in 0 .. n.len-1:
discard visit(i, n, result, deps)
for i in 0..<result.len:
result.sons[i].flags = result.sons[i].flags - {nfTempMark, nfPermMark}
when false:
# reverse the result:
let L = result.len-1
for i in 0 .. result.len div 2:
result.sons[i].flags = result.sons[i].flags - {nfTempMark, nfPermMark}
result.sons[L - i].flags = result.sons[L - i].flags - {nfTempMark, nfPermMark}
swap(result.sons[i], result.sons[L - i])
#echo result
var g = buildGraph(n, deps)
let comps = getStrongComponents(g)
mergeSections(comps, result)

@@ -795,7 +795,7 @@ proc getReader(moduleId: int): PRodReader =
# the module ID! We could introduce a mapping ID->PRodReader but I'll leave
# this for later versions if benchmarking shows the linear search causes
# problems:
for i in 0 .. <gMods.len:
for i in 0 ..< gMods.len:
result = gMods[i].rd
if result != nil and result.moduleID == moduleId: return result
return nil
@@ -861,12 +861,11 @@ proc loadMethods(r: PRodReader) =
if r.s[r.pos] == ' ': inc(r.pos)

proc getHash*(fileIdx: int32): SecureHash =
internalAssert fileIdx >= 0 and fileIdx < gMods.len

if gMods[fileIdx].hashDone:
if fileIdx <% gMods.len and gMods[fileIdx].hashDone:
return gMods[fileIdx].hash

result = secureHashFile(fileIdx.toFullPath)
if fileIdx >= gMods.len: setLen(gMods, fileIdx+1)
gMods[fileIdx].hash = result

template growCache*(cache, pos) =
@@ -912,7 +911,7 @@ proc checkDep(fileIdx: int32; cache: IdentCache): TReasonForRecompile =

proc handleSymbolFile*(module: PSym; cache: IdentCache): PRodReader =
let fileIdx = module.fileIdx
if optSymbolFiles notin gGlobalOptions:
if gSymbolFiles in {disabledSf, writeOnlySf}:
module.id = getID()
return nil
idgen.loadMaxIds(options.gProjectPath / options.gProjectName)

@@ -13,8 +13,8 @@

import
intsets, os, options, strutils, nversion, ast, astalgo, msgs, platform,
condsyms, ropes, idents, securehash, rodread, passes, importer, idgen,
rodutils
condsyms, ropes, idents, securehash, rodread, passes, idgen,
rodutils, modulepaths

from modulegraphs import ModuleGraph


@@ -73,13 +73,14 @@ proc setupVM*(module: PSym; cache: IdentCache; scriptName: string;
cbos copyFile:
os.copyFile(getString(a, 0), getString(a, 1))
cbos getLastModificationTime:
setResult(a, toSeconds(getLastModificationTime(getString(a, 0))))
# depends on Time's implementation!
setResult(a, int64(getLastModificationTime(getString(a, 0))))

cbos rawExec:
setResult(a, osproc.execCmd getString(a, 0))

cbconf getEnv:
setResult(a, os.getEnv(a.getString 0))
setResult(a, os.getEnv(a.getString 0, a.getString 1))
cbconf existsEnv:
setResult(a, os.existsEnv(a.getString 0))
cbconf dirExists:
@@ -143,6 +144,7 @@ proc setupVM*(module: PSym; cache: IdentCache; scriptName: string;

proc runNimScript*(cache: IdentCache; scriptName: string;
freshDefines=true; config: ConfigRef=nil) =
rawMessage(hintConf, scriptName)
passes.gIncludeFile = includeModule
passes.gImportModule = importModule
let graph = newModuleGraph(config)

@@ -12,7 +12,7 @@
import
ast, strutils, hashes, options, lexer, astalgo, trees, treetab,
wordrecg, ropes, msgs, os, condsyms, idents, renderer, types, platform, math,
magicsys, parser, nversion, nimsets, semfold, importer,
magicsys, parser, nversion, nimsets, semfold, modulepaths, importer,
procfind, lookups, rodread, pragmas, passes, semdata, semtypinst, sigmatch,
intsets, transf, vmdef, vm, idgen, aliases, cgmeth, lambdalifting,
evaltempl, patterns, parampatterns, sempass2, nimfix.pretty, semmacrosanity,
@@ -74,7 +74,7 @@ proc fitNode(c: PContext, formal: PType, arg: PNode; info: TLineInfo): PNode =
localError(arg.info, errExprXHasNoType,
renderTree(arg, {renderNoComments}))
# error correction:
result = copyNode(arg)
result = copyTree(arg)
result.typ = formal
else:
result = indexTypesMatch(c, formal, arg.typ, arg)
@@ -122,7 +122,7 @@ proc commonType*(x, y: PType): PType =
if a.sons[idx].kind == tyEmpty: return y
elif a.kind == tyTuple and b.kind == tyTuple and a.len == b.len:
var nt: PType
for i in 0.. <a.len:
for i in 0..<a.len:
let aEmpty = isEmptyContainer(a.sons[i])
let bEmpty = isEmptyContainer(b.sons[i])
if aEmpty != bEmpty:
@@ -165,6 +165,19 @@ proc commonType*(x, y: PType): PType =
result = newType(k, r.owner)
result.addSonSkipIntLit(r)

proc endsInNoReturn(n: PNode): bool =
# check if expr ends in raise exception or call of noreturn proc
var it = n
while it.kind in {nkStmtList, nkStmtListExpr} and it.len > 0:
it = it.lastSon
result = it.kind == nkRaiseStmt or
it.kind in nkCallKinds and it[0].kind == nkSym and sfNoReturn in it[0].sym.flags

proc commonType*(x: PType, y: PNode): PType =
# ignore exception raising branches in case/if expressions
if endsInNoReturn(y): return x
commonType(x, y.typ)

proc newSymS(kind: TSymKind, n: PNode, c: PContext): PSym =
result = newSym(kind, considerQuotedIdent(n), getCurrOwner(c), n.info)
when defined(nimsuggest):
@@ -423,7 +436,7 @@ proc semMacroExpr(c: PContext, n, nOrig: PNode, sym: PSym,
result = evalMacroCall(c.module, c.cache, n, nOrig, sym)
if efNoSemCheck notin flags:
result = semAfterMacroCall(c, n, result, sym, flags)
result = wrapInComesFrom(nOrig.info, result)
result = wrapInComesFrom(nOrig.info, sym, result)
popInfoContext()

proc forceBool(c: PContext, n: PNode): PNode =
@@ -488,6 +501,8 @@ proc myOpen(graph: ModuleGraph; module: PSym; cache: IdentCache): PPassContext =

proc myOpenCached(graph: ModuleGraph; module: PSym; rd: PRodReader): PPassContext =
result = myOpen(graph, module, rd.cache)

proc replayMethodDefs(graph: ModuleGraph; rd: PRodReader) =
for m in items(rd.methods): methodDef(graph, m, true)

proc isImportSystemStmt(n: PNode): bool =
@@ -522,14 +537,18 @@ proc semStmtAndGenerateGenerics(c: PContext, n: PNode): PNode =
else:
result = n
result = semStmt(c, result)
# BUGFIX: process newly generated generics here, not at the end!
if c.lastGenericIdx < c.generics.len:
var a = newNodeI(nkStmtList, n.info)
addCodeForGenerics(c, a)
if sonsLen(a) > 0:
# a generic has been added to `a`:
if result.kind != nkEmpty: addSon(a, result)
result = a
when false:
# Code generators are lazy now and can deal with undeclared procs, so these
# steps are not required anymore and actually harmful for the upcoming
# destructor support.
# BUGFIX: process newly generated generics here, not at the end!
if c.lastGenericIdx < c.generics.len:
var a = newNodeI(nkStmtList, n.info)
addCodeForGenerics(c, a)
if sonsLen(a) > 0:
# a generic has been added to `a`:
if result.kind != nkEmpty: addSon(a, result)
result = a
result = hloStmt(c, result)
if gCmd == cmdInteractive and not isEmptyType(result.typ):
result = buildEchoStmt(c, result)
@@ -566,6 +585,18 @@ proc myProcess(context: PPassContext, n: PNode): PNode =
result = ast.emptyNode
#if gCmd == cmdIdeTools: findSuggest(c, n)

proc testExamples(c: PContext) =
let inp = toFullPath(c.module.info)
let outp = inp.changeFileExt"" & "_examples.nim"
renderModule(c.runnableExamples, inp, outp)
let backend = if isDefined("js"): "js"
elif isDefined("cpp"): "cpp"
elif isDefined("objc"): "objc"
else: "c"
if os.execShellCmd("nim " & backend & " -r " & outp) != 0:
quit "[Examples] failed"
removeFile(outp)

proc myClose(graph: ModuleGraph; context: PPassContext, n: PNode): PNode =
var c = PContext(context)
if gCmd == cmdIdeTools and not c.suggestionsMade:
@@ -578,7 +609,10 @@ proc myClose(graph: ModuleGraph; context: PPassContext, n: PNode): PNode =
addCodeForGenerics(c, result)
if c.module.ast != nil:
result.add(c.module.ast)
if c.rd != nil:
replayMethodDefs(graph, c.rd)
popOwner(c)
popProcCon(c)
if c.runnableExamples != nil: testExamples(c)

const semPass* = makePass(myOpen, myOpenCached, myProcess, myClose)

@@ -7,8 +7,8 @@
# distribution, for details about the copyright.
#

## This module implements lifting for assignments. Later versions of this code
## will be able to also lift ``=deepCopy`` and ``=destroy``.
## This module implements lifting for type-bound operations
## (``=sink``, ``=``, ``=destroy``, ``=deepCopy``).

# included from sem.nim

@@ -22,7 +22,8 @@ type
recurse: bool

proc liftBodyAux(c: var TLiftCtx; t: PType; body, x, y: PNode)
proc liftBody(c: PContext; typ: PType; info: TLineInfo): PSym
proc liftBody(c: PContext; typ: PType; kind: TTypeAttachedOp;
info: TLineInfo): PSym {.discardable.}

proc at(a, i: PNode, elemType: PType): PNode =
result = newNodeI(nkBracketExpr, a.info, 2)
@@ -31,7 +32,7 @@ proc at(a, i: PNode, elemType: PType): PNode =
result.typ = elemType

proc liftBodyTup(c: var TLiftCtx; t: PType; body, x, y: PNode) =
for i in 0 .. <t.len:
for i in 0 ..< t.len:
let lit = lowerings.newIntLit(i)
liftBodyAux(c, t.sons[i], body, x.at(lit, t.sons[i]), y.at(lit, t.sons[i]))

@@ -57,7 +58,7 @@ proc liftBodyObj(c: var TLiftCtx; n, body, x, y: PNode) =
var access = dotField(x, n[0].sym)
caseStmt.add(access)
# copy the branches over, but replace the fields with the for loop body:
for i in 1 .. <n.len:
for i in 1 ..< n.len:
var branch = copyTree(n[i])
let L = branch.len
branch.sons[L-1] = newNodeI(nkStmtList, c.info)
@@ -97,9 +98,37 @@ proc newOpCall(op: PSym; x: PNode): PNode =
result.add(newSymNode(op))
result.add x

proc destructorCall(c: PContext; op: PSym; x: PNode): PNode =
result = newNodeIT(nkCall, x.info, op.typ.sons[0])
result.add(newSymNode(op))
if newDestructors:
result.add genAddr(c, x)
else:
result.add x

proc newDeepCopyCall(op: PSym; x, y: PNode): PNode =
result = newAsgnStmt(x, newOpCall(op, y))

proc considerAsgnOrSink(c: var TLiftCtx; t: PType; body, x, y: PNode;
field: PSym): bool =
if tfHasAsgn in t.flags:
var op: PSym
if sameType(t, c.asgnForType):
# generate recursive call:
if c.recurse:
op = c.fn
else:
c.recurse = true
return false
else:
op = field
if op == nil:
op = liftBody(c.c, t, c.kind, c.info)
markUsed(c.info, op, c.c.graph.usageSym)
styleCheckUse(c.info, op)
body.add newAsgnCall(c.c, op, x, y)
result = true

proc considerOverloadedOp(c: var TLiftCtx; t: PType; body, x, y: PNode): bool =
case c.kind
of attachedDestructor:
@@ -107,26 +136,12 @@ proc considerOverloadedOp(c: var TLiftCtx; t: PType; body, x, y: PNode): bool =
if op != nil:
markUsed(c.info, op, c.c.graph.usageSym)
styleCheckUse(c.info, op)
body.add newOpCall(op, x)
body.add destructorCall(c.c, op, x)
result = true
of attachedAsgn:
if tfHasAsgn in t.flags:
var op: PSym
if sameType(t, c.asgnForType):
# generate recursive call:
if c.recurse:
op = c.fn
else:
c.recurse = true
return false
else:
op = t.assignment
if op == nil:
op = liftBody(c.c, t, c.info)
markUsed(c.info, op, c.c.graph.usageSym)
styleCheckUse(c.info, op)
body.add newAsgnCall(c.c, op, x, y)
result = true
result = considerAsgnOrSink(c, t, body, x, y, t.assignment)
of attachedSink:
result = considerAsgnOrSink(c, t, body, x, y, t.sink)
of attachedDeepCopy:
let op = t.deepCopy
if op != nil:
@@ -188,7 +203,7 @@ proc liftBodyAux(c: var TLiftCtx; t: PType; body, x, y: PNode) =
tyPtr, tyString, tyRef, tyOpt:
defaultOp(c, t, body, x, y)
of tyArray, tySequence:
if tfHasAsgn in t.flags:
if {tfHasAsgn, tfUncheckedArray} * t.flags == {tfHasAsgn}:
if t.kind == tySequence:
# XXX add 'nil' handling here
body.add newSeqCall(c.c, x, y)
@@ -245,12 +260,20 @@ proc addParam(procType: PType; param: PSym) =
addSon(procType.n, newSymNode(param))
rawAddSon(procType, param.typ)

proc liftBody(c: PContext; typ: PType; info: TLineInfo): PSym =
proc liftBody(c: PContext; typ: PType; kind: TTypeAttachedOp;
info: TLineInfo): PSym =
var a: TLiftCtx
a.info = info
a.c = c
a.kind = kind
let body = newNodeI(nkStmtList, info)
result = newSym(skProc, getIdent":lifted=", typ.owner, info)
let procname = case kind
of attachedAsgn: getIdent"="
of attachedSink: getIdent"=sink"
of attachedDeepCopy: getIdent"=deepcopy"
of attachedDestructor: getIdent"=destroy"

result = newSym(skProc, procname, typ.owner, info)
a.fn = result
a.asgnForType = typ

@@ -261,27 +284,54 @@ proc liftBody(c: PContext; typ: PType; info: TLineInfo): PSym =

result.typ = newProcType(info, typ.owner)
result.typ.addParam dest
result.typ.addParam src
if kind != attachedDestructor:
result.typ.addParam src

liftBodyAux(a, typ, body, newSymNode(dest).newDeref, newSymNode(src))
# recursion is handled explicitly, do not register the type based operation
# before 'liftBodyAux':
case kind
of attachedAsgn: typ.assignment = result
of attachedSink: typ.sink = result
of attachedDeepCopy: typ.deepCopy = result
of attachedDestructor: typ.destructor = result

var n = newNodeI(nkProcDef, info, bodyPos+1)
for i in 0 .. < n.len: n.sons[i] = emptyNode
for i in 0 ..< n.len: n.sons[i] = emptyNode
n.sons[namePos] = newSymNode(result)
n.sons[paramsPos] = result.typ.n
n.sons[bodyPos] = body
result.ast = n
incl result.flags, sfFromGeneric

# register late as recursion is handled differently
typ.assignment = result
#echo "Produced this ", n

proc getAsgnOrLiftBody(c: PContext; typ: PType; info: TLineInfo): PSym =
let t = typ.skipTypes({tyGenericInst, tyVar, tyAlias})
result = t.assignment
if result.isNil:
result = liftBody(c, t, info)
result = liftBody(c, t, attachedAsgn, info)

proc overloadedAsgn(c: PContext; dest, src: PNode): PNode =
let a = getAsgnOrLiftBody(c, dest.typ, dest.info)
result = newAsgnCall(c, a, dest, src)

proc liftTypeBoundOps*(c: PContext; typ: PType; info: TLineInfo) =
## In the semantic pass this is called in strategic places
## to ensure we lift assignment, destructors and moves properly.
## The later 'destroyer' pass depends on it.
if not newDestructors or not hasDestructor(typ): return
when false:
# do not produce wrong liftings while we're still instantiating generics:
# now disabled; breaks topttree.nim!
if c.typesWithOps.len > 0: return
let typ = typ.skipTypes({tyGenericInst, tyAlias})
# we generate the destructor first so that other operators can depend on it:
if typ.destructor == nil:
liftBody(c, typ, attachedDestructor, info)
if typ.assignment == nil:
liftBody(c, typ, attachedAsgn, info)
if typ.sink == nil:
liftBody(c, typ, attachedSink, info)

#proc patchResolvedTypeBoundOp*(c: PContext; n: PNode): PNode =
# if n.kind == nkCall and

@@ -27,7 +27,7 @@ proc sameMethodDispatcher(a, b: PSym): bool =
# method collide[T](a: TThing, b: TUnit[T]) is instantiated and not
# method collide[T](a: TUnit[T], b: TThing)! This means we need to
# *instantiate* every candidate! However, we don't keep more than 2-3
# candidated around so we cannot implement that for now. So in order
# candidates around so we cannot implement that for now. So in order
# to avoid subtle problems, the call remains ambiguous and needs to
# be disambiguated by the programmer; this way the right generic is
# instantiated.
@@ -90,6 +90,10 @@ proc pickBestCandidate(c: PContext, headSymbol: PNode,
if c.currentScope.symbols.counter == counterInitial or syms != nil:
matches(c, n, orig, z)
if z.state == csMatch:
#if sym.name.s == "==" and (n.info ?? "temp3"):
# echo typeToString(sym.typ)
# writeMatches(z)

# little hack so that iterators are preferred over everything else:
if sym.kind == skIterator: inc(z.exactMatches, 200)
case best.state
@@ -175,7 +179,7 @@ proc notFoundError*(c: PContext, n: PNode, errors: CandidateErrors) =
add(result, ')')
if candidates != "":
add(result, "\n" & msgKindToString(errButExpected) & "\n" & candidates)
localError(n.info, errGenerated, result)
localError(n.info, errGenerated, result & "\nexpression: " & $n)

proc bracketNotFoundError(c: PContext; n: PNode) =
var errors: CandidateErrors = @[]
@@ -231,12 +235,11 @@ proc resolveOverloads(c: PContext, n, orig: PNode,

if nfDotField in n.flags:
internalAssert f.kind == nkIdent and n.sonsLen >= 2
let calleeName = newStrNode(nkStrLit, f.ident.s).withInfo(n.info)

# leave the op head symbol empty,
# we are going to try multiple variants
n.sons[0..1] = [nil, n[1], calleeName]
orig.sons[0..1] = [nil, orig[1], calleeName]
n.sons[0..1] = [nil, n[1], f]
orig.sons[0..1] = [nil, orig[1], f]

template tryOp(x) =
let op = newIdentNode(getIdent(x), n.info)
@@ -251,8 +254,8 @@ proc resolveOverloads(c: PContext, n, orig: PNode,
tryOp "."

elif nfDotSetter in n.flags and f.kind == nkIdent and n.len == 3:
let calleeName = newStrNode(nkStrLit,
f.ident.s[0..f.ident.s.len-2]).withInfo(n.info)
# we need to strip away the trailing '=' here:
let calleeName = newIdentNode(getIdent(f.ident.s[0..f.ident.s.len-2]), n.info)
let callOp = newIdentNode(getIdent".=", n.info)
n.sons[0..1] = [callOp, n[1], calleeName]
orig.sons[0..1] = [callOp, orig[1], calleeName]
@@ -306,7 +309,7 @@ proc instGenericConvertersArg*(c: PContext, a: PNode, x: TCandidate) =
proc instGenericConvertersSons*(c: PContext, n: PNode, x: TCandidate) =
assert n.kind in nkCallKinds
if x.genericConverter:
for i in 1 .. <n.len:
for i in 1 ..< n.len:
instGenericConvertersArg(c, n.sons[i], x)

proc indexTypesMatch(c: PContext, f, a: PType, arg: PNode): PNode =
@@ -490,7 +493,7 @@ proc searchForBorrowProc(c: PContext, startScope: PScope, fn: PSym): PSym =
var call = newNodeI(nkCall, fn.info)
var hasDistinct = false
call.add(newIdentNode(fn.name, fn.info))
for i in 1.. <fn.typ.n.len:
for i in 1..<fn.typ.n.len:
let param = fn.typ.n.sons[i]
let t = skipTypes(param.typ, abstractVar-{tyTypeDesc, tyDistinct})
if t.kind == tyDistinct or param.typ.kind == tyDistinct: hasDistinct = true

@@ -37,7 +37,6 @@ type
# in standalone ``except`` and ``finally``
next*: PProcCon # used for stacking procedure contexts
wasForwarded*: bool # whether the current proc has a separate header
bracketExpr*: PNode # current bracket expression (for ^ support)
mapping*: TIdTable

TMatchedConcept* = object
@@ -70,6 +69,7 @@ type

TTypeAttachedOp* = enum
attachedAsgn,
attachedSink,
attachedDeepCopy,
attachedDestructor

@@ -131,6 +131,12 @@ type
recursiveDep*: string
suggestionsMade*: bool
inTypeContext*: int
typesWithOps*: seq[(PType, PType)] #\
# We need to instantiate the type bound ops lazily after
# the generic type has been constructed completely. See
# tests/destructor/topttree.nim for an example that
# would otherwise fail.
runnableExamples*: PNode

proc makeInstPair*(s: PSym, inst: PInstantiation): TInstantiationPair =
result.genericSym = s
@@ -218,6 +224,7 @@ proc newContext*(graph: ModuleGraph; module: PSym; cache: IdentCache): PContext
result.cache = cache
result.graph = graph
initStrTable(result.signatures)
result.typesWithOps = @[]


proc inclSym(sq: var TSymSeq, s: PSym) =
@@ -333,7 +340,7 @@ proc makeNotType*(c: PContext, t1: PType): PType =

proc nMinusOne*(n: PNode): PNode =
result = newNode(nkCall, n.info, @[
newSymNode(getSysMagic("<", mUnaryLt)),
newSymNode(getSysMagic("pred", mPred)),
n])

# Remember to fix the procs below this one when you make changes!

@@ -1,186 +0,0 @@
|
||||
#
|
||||
#
|
||||
# The Nim Compiler
|
||||
# (c) Copyright 2013 Andreas Rumpf
|
||||
#
|
||||
# See the file "copying.txt", included in this
|
||||
# distribution, for details about the copyright.
|
||||
#
|
||||
|
||||
## This module implements destructors.
|
||||
|
||||
# included from sem.nim
|
||||
|
||||
# special marker values that indicate that we are
# 1) AnalyzingDestructor: currently analyzing the type for destructor
|
||||
# generation (needed for recursive types)
|
||||
# 2) DestructorIsTrivial: completed the analysis before and determined
|
||||
# that the type has a trivial destructor
|
||||
var analyzingDestructor, destructorIsTrivial: PSym
|
||||
new(analyzingDestructor)
|
||||
new(destructorIsTrivial)
|
||||
|
||||
var
|
||||
destructorName = getIdent"destroy_"
|
||||
destructorParam = getIdent"this_"
|
||||
destructorPragma = newIdentNode(getIdent"destructor", unknownLineInfo())
|
||||
|
||||
proc instantiateDestructor(c: PContext, typ: PType): PType
|
||||
|
||||
proc doDestructorStuff(c: PContext, s: PSym, n: PNode) =
|
||||
var t = s.typ.sons[1].skipTypes({tyVar})
|
||||
if t.kind == tyGenericInvocation:
|
||||
for i in 1 .. <t.sonsLen:
|
||||
if t.sons[i].kind != tyGenericParam:
|
||||
localError(n.info, errDestructorNotGenericEnough)
|
||||
return
|
||||
t = t.base
|
||||
elif t.kind == tyCompositeTypeClass:
|
||||
t = t.base
|
||||
if t.kind != tyGenericBody:
|
||||
localError(n.info, errDestructorNotGenericEnough)
|
||||
return
|
||||
|
||||
t.destructor = s
|
||||
# automatically insert calls to base classes' destructors
|
||||
if n.sons[bodyPos].kind != nkEmpty:
|
||||
for i in countup(0, t.sonsLen - 1):
|
||||
# when inheriting directly from object
|
||||
# there will be a single nil son
|
||||
if t.sons[i] == nil: continue
|
||||
let destructableT = instantiateDestructor(c, t.sons[i])
|
||||
if destructableT != nil:
|
||||
n.sons[bodyPos].addSon(newNode(nkCall, t.sym.info, @[
|
||||
useSym(destructableT.destructor, c.graph.usageSym),
|
||||
n.sons[paramsPos][1][0]]))
|
||||
|
||||
proc destroyFieldOrFields(c: PContext, field: PNode, holder: PNode): PNode
|
||||
|
||||
proc destroySym(c: PContext, field: PSym, holder: PNode): PNode =
|
||||
let destructableT = instantiateDestructor(c, field.typ)
|
||||
if destructableT != nil:
|
||||
result = newNode(nkCall, field.info, @[
|
||||
useSym(destructableT.destructor, c.graph.usageSym),
|
||||
newNode(nkDotExpr, field.info, @[holder, useSym(field, c.graph.usageSym)])])
|
||||
|
||||
proc destroyCase(c: PContext, n: PNode, holder: PNode): PNode =
|
||||
var nonTrivialFields = 0
|
||||
result = newNode(nkCaseStmt, n.info, @[])
|
||||
# case x.kind
|
||||
result.addSon(newNode(nkDotExpr, n.info, @[holder, n.sons[0]]))
|
||||
for i in countup(1, n.len - 1):
|
||||
# of A, B:
|
||||
let ni = n[i]
|
||||
var caseBranch = newNode(ni.kind, ni.info, ni.sons[0..ni.len-2])
|
||||
|
||||
let stmt = destroyFieldOrFields(c, ni.lastSon, holder)
|
||||
if stmt == nil:
|
||||
caseBranch.addSon(newNode(nkStmtList, ni.info, @[]))
|
||||
else:
|
||||
caseBranch.addSon(stmt)
|
||||
nonTrivialFields += stmt.len
|
||||
|
||||
result.addSon(caseBranch)
|
||||
|
||||
# maybe no fields were destroyed?
|
||||
if nonTrivialFields == 0:
|
||||
result = nil
|
||||
|
||||
proc destroyFieldOrFields(c: PContext, field: PNode, holder: PNode): PNode =
|
||||
template maybeAddLine(e) =
|
||||
let stmt = e
|
||||
if stmt != nil:
|
||||
if result == nil: result = newNode(nkStmtList)
|
||||
result.addSon(stmt)
|
||||
|
||||
case field.kind
|
||||
of nkRecCase:
|
||||
maybeAddLine destroyCase(c, field, holder)
|
||||
of nkSym:
|
||||
maybeAddLine destroySym(c, field.sym, holder)
|
||||
of nkRecList:
|
||||
for son in field:
|
||||
maybeAddLine destroyFieldOrFields(c, son, holder)
|
||||
else:
|
||||
internalAssert false
|
||||
|
||||
proc generateDestructor(c: PContext, t: PType): PNode =
|
||||
## generate a destructor for a user-defined object or tuple type
|
||||
## returns nil if the destructor turns out to be trivial
|
||||
|
||||
# XXX: This may be true for some C-imported types such as
|
||||
# Tposix_spawnattr
|
||||
if t.n == nil or t.n.sons == nil: return
|
||||
internalAssert t.n.kind == nkRecList
|
||||
let destructedObj = newIdentNode(destructorParam, unknownLineInfo())
|
||||
# call the destructors of all fields
result = destroyFieldOrFields(c, t.n, destructedObj)
|
||||
# base classes' destructors will be automatically called by
|
||||
# semProcAux for both auto-generated and user-defined destructors
|
||||
|
||||
proc instantiateDestructor(c: PContext, typ: PType): PType =
|
||||
# returns nil if a variable of type `typ` doesn't require a
|
||||
# destructor. Otherwise, returns the type, which holds the
|
||||
# destructor that must be used for the variable.
# The destructor is either user-defined or automatically
|
||||
# generated by the compiler in a member-wise fashion.
|
||||
var t = typ.skipGenericAlias
|
||||
let typeHoldingUserDefinition = if t.kind == tyGenericInst: t.base else: t
|
||||
|
||||
if typeHoldingUserDefinition.destructor != nil:
|
||||
# XXX: This is not entirely correct for recursive types, but we need
|
||||
# it temporarily to hide the "destroy is already defined" problem
|
||||
if typeHoldingUserDefinition.destructor notin
|
||||
[analyzingDestructor, destructorIsTrivial]:
|
||||
return typeHoldingUserDefinition
|
||||
else:
|
||||
return nil
|
||||
|
||||
t = t.skipTypes({tyGenericInst, tyAlias})
|
||||
case t.kind
|
||||
of tySequence, tyArray, tyOpenArray, tyVarargs:
|
||||
t.destructor = analyzingDestructor
|
||||
if instantiateDestructor(c, t.sons[0]) != nil:
|
||||
t.destructor = getCompilerProc"nimDestroyRange"
|
||||
return t
|
||||
else:
|
||||
return nil
|
||||
of tyTuple, tyObject:
|
||||
t.destructor = analyzingDestructor
|
||||
let generated = generateDestructor(c, t)
|
||||
if generated != nil:
|
||||
internalAssert t.sym != nil
|
||||
var i = t.sym.info
|
||||
let fullDef = newNode(nkProcDef, i, @[
|
||||
newIdentNode(destructorName, i),
|
||||
emptyNode,
|
||||
emptyNode,
|
||||
newNode(nkFormalParams, i, @[
|
||||
emptyNode,
|
||||
newNode(nkIdentDefs, i, @[
|
||||
newIdentNode(destructorParam, i),
|
||||
symNodeFromType(c, makeVarType(c, t), t.sym.info),
|
||||
emptyNode]),
|
||||
]),
|
||||
newNode(nkPragma, i, @[destructorPragma]),
|
||||
emptyNode,
|
||||
generated
|
||||
])
|
||||
let semantizedDef = semProc(c, fullDef)
|
||||
t.destructor = semantizedDef[namePos].sym
|
||||
return t
|
||||
else:
|
||||
t.destructor = destructorIsTrivial
|
||||
return nil
|
||||
else:
|
||||
return nil
|
||||
|
||||
proc createDestructorCall(c: PContext, s: PSym): PNode =
|
||||
let varTyp = s.typ
|
||||
if varTyp == nil or sfGlobal in s.flags: return
|
||||
let destructableT = instantiateDestructor(c, varTyp)
|
||||
if destructableT != nil:
|
||||
let call = semStmt(c, newNode(nkCall, s.info, @[
|
||||
useSym(destructableT.destructor, c.graph.usageSym),
|
||||
useSym(s, c.graph.usageSym)]))
|
||||
result = newNode(nkDefer, s.info, @[call])
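The whole of the old destructor module above is deleted: it synthesized a ``destroy_`` proc per object or tuple type and attached it through the ``destructor`` pragma. The replacement direction referenced elsewhere in this diff (``liftTypeBoundOps``, tests/destructor/topttree.nim) is based on type-bound ``=destroy`` hooks. A minimal hedged sketch of that style; whether it compiles as-is depends on the destructor support of the compiler version in question:

```nim
type
  Resource = object
    handle: int

proc `=destroy`(r: var Resource) =
  # user-defined hook, run when `r` goes out of scope
  if r.handle != 0:
    echo "releasing handle ", r.handle
    r.handle = 0

proc work() =
  var r = Resource(handle: 42)
  echo "using handle ", r.handle

work()
```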
@@ -53,7 +53,6 @@ proc semExprWithType(c: PContext, n: PNode, flags: TExprFlags = {}): PNode =
|
||||
else:
|
||||
if efNoProcvarCheck notin flags: semProcvarCheck(c, result)
|
||||
if result.typ.kind == tyVar: result = newDeref(result)
|
||||
semDestructorCheck(c, result, flags)
|
||||
|
||||
proc semExprNoDeref(c: PContext, n: PNode, flags: TExprFlags = {}): PNode =
|
||||
result = semExpr(c, n, flags)
|
||||
@@ -66,7 +65,6 @@ proc semExprNoDeref(c: PContext, n: PNode, flags: TExprFlags = {}): PNode =
|
||||
result.typ = errorType(c)
|
||||
else:
|
||||
semProcvarCheck(c, result)
|
||||
semDestructorCheck(c, result, flags)
|
||||
|
||||
proc semSymGenericInstantiation(c: PContext, n: PNode, s: PSym): PNode =
|
||||
result = symChoice(c, n, s, scClosed)
|
||||
@@ -436,12 +434,12 @@ proc semArrayConstr(c: PContext, n: PNode, flags: TExprFlags): PNode =
|
||||
#addSon(result, fitNode(c, typ, n.sons[i]))
|
||||
inc(lastIndex)
|
||||
addSonSkipIntLit(result.typ, typ)
|
||||
for i in 0 .. <result.len:
|
||||
for i in 0 ..< result.len:
|
||||
result.sons[i] = fitNode(c, typ, result.sons[i], result.sons[i].info)
|
||||
result.typ.sons[0] = makeRangeType(c, 0, sonsLen(result) - 1, n.info)
|
||||
|
||||
proc fixAbstractType(c: PContext, n: PNode) =
|
||||
for i in 1 .. < n.len:
|
||||
for i in 1 ..< n.len:
|
||||
let it = n.sons[i]
|
||||
# do not get rid of nkHiddenSubConv for OpenArrays, the codegen needs it:
|
||||
if it.kind == nkHiddenSubConv and
|
||||
@@ -465,7 +463,7 @@ proc newHiddenAddrTaken(c: PContext, n: PNode): PNode =
|
||||
result = newNodeIT(nkHiddenAddr, n.info, makeVarType(c, n.typ))
|
||||
addSon(result, n)
|
||||
if isAssignable(c, n) notin {arLValue, arLocalLValue}:
|
||||
localError(n.info, errVarForOutParamNeeded)
|
||||
localError(n.info, errVarForOutParamNeededX, $n)
|
||||
|
||||
proc analyseIfAddressTaken(c: PContext, n: PNode): PNode =
|
||||
result = n
|
||||
@@ -509,9 +507,10 @@ proc analyseIfAddressTakenInCall(c: PContext, n: PNode) =
|
||||
for i in countup(1, sonsLen(n) - 1):
|
||||
if i < sonsLen(t) and t.sons[i] != nil and
|
||||
skipTypes(t.sons[i], abstractInst-{tyTypeDesc}).kind == tyVar:
|
||||
if isAssignable(c, n.sons[i]) notin {arLValue, arLocalLValue}:
|
||||
if n.sons[i].kind != nkHiddenAddr:
|
||||
localError(n.sons[i].info, errVarForOutParamNeeded)
|
||||
let it = n[i]
|
||||
if isAssignable(c, it) notin {arLValue, arLocalLValue}:
|
||||
if it.kind != nkHiddenAddr:
|
||||
localError(it.info, errVarForOutParamNeededX, $it)
|
||||
return
|
||||
for i in countup(1, sonsLen(n) - 1):
|
||||
if n.sons[i].kind == nkHiddenCallConv:
|
||||
@@ -539,7 +538,7 @@ proc evalAtCompileTime(c: PContext, n: PNode): PNode =
|
||||
var call = newNodeIT(nkCall, n.info, n.typ)
|
||||
call.add(n.sons[0])
|
||||
var allConst = true
|
||||
for i in 1 .. < n.len:
|
||||
for i in 1 ..< n.len:
|
||||
var a = getConstExpr(c.module, n.sons[i])
|
||||
if a == nil:
|
||||
allConst = false
|
||||
@@ -550,7 +549,6 @@ proc evalAtCompileTime(c: PContext, n: PNode): PNode =
|
||||
result = semfold.getConstExpr(c.module, call)
|
||||
if result.isNil: result = n
|
||||
else: return result
|
||||
result.typ = semfold.getIntervalType(callee.magic, call)
|
||||
|
||||
block maybeLabelAsStatic:
|
||||
# XXX: temporary work-around needed for tlateboundstatic.
|
||||
@@ -558,7 +556,7 @@ proc evalAtCompileTime(c: PContext, n: PNode): PNode =
|
||||
# done until we have a more robust infrastructure for
|
||||
# implicit statics.
|
||||
if n.len > 1:
|
||||
for i in 1 .. <n.len:
|
||||
for i in 1 ..< n.len:
|
||||
# see bug #2113: n[i].typ may be nil for erroneous code:
if n[i].typ.isNil or n[i].typ.kind != tyStatic or
|
||||
tfUnresolved notin n[i].typ.flags:
|
||||
@@ -580,7 +578,7 @@ proc evalAtCompileTime(c: PContext, n: PNode): PNode =
|
||||
|
||||
var call = newNodeIT(nkCall, n.info, n.typ)
|
||||
call.add(n.sons[0])
|
||||
for i in 1 .. < n.len:
|
||||
for i in 1 ..< n.len:
|
||||
let a = getConstExpr(c.module, n.sons[i])
|
||||
if a == nil: return n
|
||||
call.add(a)
|
||||
@@ -654,7 +652,7 @@ proc bracketedMacro(n: PNode): PSym =
|
||||
result = nil
|
||||
|
||||
proc setGenericParams(c: PContext, n: PNode) =
|
||||
for i in 1 .. <n.len:
|
||||
for i in 1 ..< n.len:
|
||||
n[i].typ = semTypeNode(c, n[i], nil)
|
||||
|
||||
proc afterCallActions(c: PContext; n, orig: PNode, flags: TExprFlags): PNode =
|
||||
@@ -670,6 +668,8 @@ proc afterCallActions(c: PContext; n, orig: PNode, flags: TExprFlags): PNode =
|
||||
analyseIfAddressTakenInCall(c, result)
|
||||
if callee.magic != mNone:
|
||||
result = magicsAfterOverloadResolution(c, result, flags)
|
||||
if result.typ != nil: liftTypeBoundOps(c, result.typ, n.info)
|
||||
#result = patchResolvedTypeBoundOp(c, result)
|
||||
if c.matchedConcept == nil:
|
||||
result = evalAtCompileTime(c, result)
|
||||
|
||||
@@ -780,6 +780,19 @@ proc buildEchoStmt(c: PContext, n: PNode): PNode =
|
||||
|
||||
proc semExprNoType(c: PContext, n: PNode): PNode =
|
||||
result = semExpr(c, n, {efWantStmt})
|
||||
# make an 'if' expression an 'if' statement again for backwards
|
||||
# compatibility (.discardable was a bad idea!); bug #6980
|
||||
var isStmt = false
|
||||
if result.kind == nkIfExpr:
|
||||
isStmt = true
|
||||
for condActionPair in result:
|
||||
let action = condActionPair.lastSon
|
||||
if not implicitlyDiscardable(action) and not
|
||||
endsInNoReturn(action):
|
||||
isStmt = false
|
||||
if isStmt:
|
||||
result.kind = nkIfStmt
|
||||
result.typ = nil
|
||||
discardCheck(c, result)
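The block added to ``semExprNoType`` turns an ``if`` expression back into an ``if`` statement when every branch is implicitly discardable or ends in a noreturn statement (bug #6980). A hedged illustration of the user code this is meant to accept:

```nim
proc report(x: int): int {.discardable.} = x * 2

# both branches call a .discardable proc, so this `if` is treated as a
# plain statement again and nothing has to be discarded explicitly
if true:
  report(1)
else:
  report(2)
```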
|
||||
proc isTypeExpr(n: PNode): bool =
|
||||
@@ -1188,7 +1201,6 @@ proc semSubscript(c: PContext, n: PNode, flags: TExprFlags): PNode =
|
||||
tyCString:
|
||||
if n.len != 2: return nil
|
||||
n.sons[0] = makeDeref(n.sons[0])
|
||||
c.p.bracketExpr = n.sons[0]
|
||||
for i in countup(1, sonsLen(n) - 1):
|
||||
n.sons[i] = semExprWithType(c, n.sons[i],
|
||||
flags*{efInTypeof, efDetermineType})
|
||||
@@ -1209,7 +1221,6 @@ proc semSubscript(c: PContext, n: PNode, flags: TExprFlags): PNode =
|
||||
of tyTuple:
|
||||
if n.len != 2: return nil
|
||||
n.sons[0] = makeDeref(n.sons[0])
|
||||
c.p.bracketExpr = n.sons[0]
|
||||
# [] operator for tuples requires constant expression:
|
||||
n.sons[1] = semConstExpr(c, n.sons[1])
|
||||
if skipTypes(n.sons[1].typ, {tyGenericInst, tyRange, tyOrdinal, tyAlias}).kind in
|
||||
@@ -1247,17 +1258,13 @@ proc semSubscript(c: PContext, n: PNode, flags: TExprFlags): PNode =
|
||||
of skType:
|
||||
result = symNodeFromType(c, semTypeNode(c, n, nil), n.info)
|
||||
else:
|
||||
c.p.bracketExpr = n.sons[0]
|
||||
else:
|
||||
c.p.bracketExpr = n.sons[0]
|
||||
discard
|
||||
|
||||
proc semArrayAccess(c: PContext, n: PNode, flags: TExprFlags): PNode =
|
||||
let oldBracketExpr = c.p.bracketExpr
|
||||
result = semSubscript(c, n, flags)
|
||||
if result == nil:
|
||||
# overloaded [] operator:
|
||||
result = semExpr(c, buildOverloadedSubscripts(n, getIdent"[]"))
|
||||
c.p.bracketExpr = oldBracketExpr
|
||||
|
||||
proc propertyWriteAccess(c: PContext, n, nOrig, a: PNode): PNode =
|
||||
var id = considerQuotedIdent(a[1], a)
|
||||
@@ -1296,13 +1303,10 @@ proc takeImplicitAddr(c: PContext, n: PNode): PNode =
|
||||
proc asgnToResultVar(c: PContext, n, le, ri: PNode) {.inline.} =
|
||||
if le.kind == nkHiddenDeref:
|
||||
var x = le.sons[0]
|
||||
if x.typ.kind == tyVar and x.kind == nkSym:
|
||||
if x.sym.kind == skResult:
|
||||
n.sons[0] = x # 'result[]' --> 'result'
|
||||
n.sons[1] = takeImplicitAddr(c, ri)
|
||||
if x.sym.kind != skParam:
|
||||
# XXX This is hacky. See bug #4910.
|
||||
x.typ.flags.incl tfVarIsPtr
|
||||
if x.typ.kind == tyVar and x.kind == nkSym and x.sym.kind == skResult:
|
||||
n.sons[0] = x # 'result[]' --> 'result'
|
||||
n.sons[1] = takeImplicitAddr(c, ri)
|
||||
x.typ.flags.incl tfVarIsPtr
|
||||
#echo x.info, " setting it for this type ", typeToString(x.typ), " ", n.info
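``asgnToResultVar`` now marks the type with ``tfVarIsPtr`` whenever ``result`` of a ``var`` return type is assigned through (see the #4048/#4910/#6892 bug references in nearby hunks). A hedged sketch of the kind of ``var`` return this machinery supports:

```nim
var store = @[1, 2, 3]

proc at(s: var seq[int]; i: int): var int =
  # assigning to `result` of a `var` return type; the compiler
  # rewrites this into taking the address of s[i]
  result = s[i]

at(store, 0) = 99   # write through the returned location
echo store          # @[99, 2, 3]
```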
|
||||
template resultTypeIsInferrable(typ: PType): untyped =
|
||||
@@ -1329,7 +1333,6 @@ proc semAsgn(c: PContext, n: PNode; mode=asgnNormal): PNode =
|
||||
of nkBracketExpr:
|
||||
# a[i] = x
|
||||
# --> `[]=`(a, i, x)
|
||||
let oldBracketExpr = c.p.bracketExpr
|
||||
a = semSubscript(c, a, {efLValue})
|
||||
if a == nil:
|
||||
result = buildOverloadedSubscripts(n.sons[0], getIdent"[]=")
|
||||
@@ -1339,9 +1342,7 @@ proc semAsgn(c: PContext, n: PNode; mode=asgnNormal): PNode =
|
||||
return n
|
||||
else:
|
||||
result = semExprNoType(c, result)
|
||||
c.p.bracketExpr = oldBracketExpr
|
||||
return result
|
||||
c.p.bracketExpr = oldBracketExpr
|
||||
of nkCurlyExpr:
|
||||
# a{i} = x --> `{}=`(a, i, x)
|
||||
result = buildOverloadedSubscripts(n.sons[0], getIdent"{}=")
|
||||
@@ -1377,19 +1378,24 @@ proc semAsgn(c: PContext, n: PNode; mode=asgnNormal): PNode =
|
||||
if lhsIsResult:
|
||||
n.typ = enforceVoidContext
|
||||
if c.p.owner.kind != skMacro and resultTypeIsInferrable(lhs.sym.typ):
|
||||
if cmpTypes(c, lhs.typ, rhs.typ) == isGeneric:
|
||||
var rhsTyp = rhs.typ
|
||||
if rhsTyp.kind in tyUserTypeClasses and rhsTyp.isResolvedUserTypeClass:
|
||||
rhsTyp = rhsTyp.lastSon
|
||||
if cmpTypes(c, lhs.typ, rhsTyp) in {isGeneric, isEqual}:
|
||||
internalAssert c.p.resultSym != nil
|
||||
lhs.typ = rhs.typ
|
||||
c.p.resultSym.typ = rhs.typ
|
||||
c.p.owner.typ.sons[0] = rhs.typ
|
||||
lhs.typ = rhsTyp
|
||||
c.p.resultSym.typ = rhsTyp
|
||||
c.p.owner.typ.sons[0] = rhsTyp
|
||||
else:
|
||||
typeMismatch(n.info, lhs.typ, rhs.typ)
|
||||
typeMismatch(n.info, lhs.typ, rhsTyp)
|
||||
|
||||
n.sons[1] = fitNode(c, le, rhs, n.info)
|
||||
if not newDestructors:
|
||||
if tfHasAsgn in lhs.typ.flags and not lhsIsResult and
|
||||
mode != noOverloadedAsgn:
|
||||
return overloadedAsgn(c, lhs, n.sons[1])
|
||||
else:
|
||||
liftTypeBoundOps(c, lhs.typ, lhs.info)
|
||||
|
||||
fixAbstractType(c, n)
|
||||
asgnToResultVar(c, n, n.sons[0], n.sons[1])
|
||||
@@ -1419,11 +1425,7 @@ proc semProcBody(c: PContext, n: PNode): PNode =
|
||||
openScope(c)
|
||||
result = semExpr(c, n)
|
||||
if c.p.resultSym != nil and not isEmptyType(result.typ):
|
||||
# transform ``expr`` to ``result = expr``, but not if the expr is already
|
||||
# ``result``:
|
||||
if result.kind == nkSym and result.sym == c.p.resultSym:
|
||||
discard
|
||||
elif result.kind == nkNilLit:
|
||||
if result.kind == nkNilLit:
|
||||
# or ImplicitlyDiscardable(result):
|
||||
# new semantic: 'result = x' triggers the void context
|
||||
result.typ = nil
|
||||
@@ -1456,14 +1458,15 @@ proc semYieldVarResult(c: PContext, n: PNode, restype: PType) =
|
||||
var t = skipTypes(restype, {tyGenericInst, tyAlias})
|
||||
case t.kind
|
||||
of tyVar:
|
||||
t.flags.incl tfVarIsPtr # bugfix for #4048, #4910, #6892
|
||||
if n.sons[0].kind in {nkHiddenStdConv, nkHiddenSubConv}:
|
||||
n.sons[0] = n.sons[0].sons[1]
|
||||
|
||||
n.sons[0] = takeImplicitAddr(c, n.sons[0])
|
||||
of tyTuple:
|
||||
for i in 0.. <t.sonsLen:
|
||||
for i in 0..<t.sonsLen:
|
||||
var e = skipTypes(t.sons[i], {tyGenericInst, tyAlias})
|
||||
if e.kind == tyVar:
|
||||
e.flags.incl tfVarIsPtr # bugfix for #4048, #4910, #6892
|
||||
if n.sons[0].kind == nkPar:
|
||||
n.sons[0].sons[i] = takeImplicitAddr(c, n.sons[0].sons[i])
|
||||
elif n.sons[0].kind in {nkHiddenStdConv, nkHiddenSubConv} and
|
||||
@@ -1654,7 +1657,7 @@ proc processQuotations(n: var PNode, op: string,
|
||||
elif n.kind == nkAccQuoted and op == "``":
|
||||
returnQuote n[0]
|
||||
|
||||
for i in 0 .. <n.safeLen:
|
||||
for i in 0 ..< n.safeLen:
|
||||
processQuotations(n.sons[i], op, quotes, ids)
|
||||
|
||||
proc semQuoteAst(c: PContext, n: PNode): PNode =
|
||||
@@ -1662,7 +1665,7 @@ proc semQuoteAst(c: PContext, n: PNode): PNode =
|
||||
# We transform the do block into a template with a param for
|
||||
# each interpolation. We'll pass this template to getAst.
|
||||
var
|
||||
quotedBlock = n{-1}
|
||||
quotedBlock = n[^1]
|
||||
op = if n.len == 3: expectString(c, n[1]) else: "``"
|
||||
quotes = newSeq[PNode](1)
|
||||
# the quotes will be added to a nkCall statement
|
||||
@@ -1782,6 +1785,13 @@ proc setMs(n: PNode, s: PSym): PNode =
|
||||
n.sons[0] = newSymNode(s)
|
||||
n.sons[0].info = n.info
|
||||
|
||||
proc extractImports(n: PNode; result: PNode) =
|
||||
if n.kind in {nkImportStmt, nkImportExceptStmt, nkFromStmt}:
|
||||
result.add copyTree(n)
|
||||
n.kind = nkEmpty
|
||||
return
|
||||
for i in 0..<n.safeLen: extractImports(n[i], result)
|
||||
|
||||
proc semMagic(c: PContext, n: PNode, s: PSym, flags: TExprFlags): PNode =
|
||||
# this is a hotspot in the compiler!
|
||||
# DON'T forget to update ast.SpecialSemMagics if you add a magic here!
|
||||
@@ -1822,7 +1832,7 @@ proc semMagic(c: PContext, n: PNode, s: PSym, flags: TExprFlags): PNode =
|
||||
dec c.inParallelStmt
|
||||
of mSpawn:
|
||||
result = setMs(n, s)
|
||||
for i in 1 .. <n.len:
|
||||
for i in 1 ..< n.len:
|
||||
result.sons[i] = semExpr(c, n.sons[i])
|
||||
let typ = result[^1].typ
|
||||
if not typ.isEmptyType:
|
||||
@@ -1853,6 +1863,21 @@ proc semMagic(c: PContext, n: PNode, s: PSym, flags: TExprFlags): PNode =
|
||||
analyseIfAddressTakenInCall(c, result)
|
||||
if callee.magic != mNone:
|
||||
result = magicsAfterOverloadResolution(c, result, flags)
|
||||
of mRunnableExamples:
|
||||
if gCmd == cmdDoc and n.len >= 2 and n.lastSon.kind == nkStmtList:
|
||||
if n.sons[0].kind == nkIdent:
|
||||
if sfMainModule in c.module.flags:
|
||||
let inp = toFullPath(c.module.info)
|
||||
if c.runnableExamples == nil:
|
||||
c.runnableExamples = newTree(nkStmtList,
|
||||
newTree(nkImportStmt, newStrNode(nkStrLit, expandFilename(inp))))
|
||||
let imports = newTree(nkStmtList)
|
||||
extractImports(n.lastSon, imports)
|
||||
for imp in imports: c.runnableExamples.add imp
|
||||
c.runnableExamples.add newTree(nkBlockStmt, emptyNode, copyTree n.lastSon)
|
||||
result = setMs(n, s)
|
||||
else:
|
||||
result = emptyNode
|
||||
else:
|
||||
result = semDirectOp(c, n, flags)
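The new ``mRunnableExamples`` branch collects ``runnableExamples`` blocks (hoisting any imports inside them via ``extractImports``) into ``c.runnableExamples`` when the main module is being documented. A hedged sketch of how the feature looks from library code:

```nim
proc double*(x: int): int =
  ## Doubles `x`.
  runnableExamples:
    import strutils            # imports inside the block are hoisted
    doAssert double(2) == 4
    doAssert parseInt("21").double == 42
  result = x * 2
```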
|
||||
@@ -2073,7 +2098,7 @@ proc semBlock(c: PContext, n: PNode): PNode =
|
||||
proc semExport(c: PContext, n: PNode): PNode =
|
||||
var x = newNodeI(n.kind, n.info)
|
||||
#let L = if n.kind == nkExportExceptStmt: L = 1 else: n.len
|
||||
for i in 0.. <n.len:
|
||||
for i in 0..<n.len:
|
||||
let a = n.sons[i]
|
||||
var o: TOverloadIter
|
||||
var s = initOverloadIter(o, c, a)
|
||||
@@ -2104,7 +2129,7 @@ proc shouldBeBracketExpr(n: PNode): bool =
|
||||
let b = a[0]
|
||||
if b.kind in nkSymChoices:
|
||||
for i in 0..<b.len:
|
||||
if b[i].sym.magic == mArrGet:
|
||||
if b[i].kind == nkSym and b[i].sym.magic == mArrGet:
|
||||
let be = newNodeI(nkBracketExpr, n.info)
|
||||
for i in 1..<a.len: be.add(a[i])
|
||||
n.sons[0] = be
|
||||
@@ -2118,6 +2143,8 @@ proc semExpr(c: PContext, n: PNode, flags: TExprFlags = {}): PNode =
|
||||
of nkIdent, nkAccQuoted:
|
||||
let checks = if efNoEvaluateGeneric in flags:
|
||||
{checkUndeclared, checkPureEnumFields}
|
||||
elif efInCall in flags:
|
||||
{checkUndeclared, checkModule, checkPureEnumFields}
|
||||
else:
|
||||
{checkUndeclared, checkModule, checkAmbiguity, checkPureEnumFields}
|
||||
var s = qualifiedLookUp(c, n, checks)
|
||||
@@ -2212,10 +2239,10 @@ proc semExpr(c: PContext, n: PNode, flags: TExprFlags = {}): PNode =
|
||||
# XXX think about this more (``set`` procs)
|
||||
if n.len == 2:
|
||||
result = semConv(c, n)
|
||||
elif contains(c.ambiguousSymbols, s.id) and n.len == 1:
|
||||
errorUseQualifier(c, n.info, s)
|
||||
elif n.len == 1:
|
||||
result = semObjConstr(c, n, flags)
|
||||
elif contains(c.ambiguousSymbols, s.id):
|
||||
errorUseQualifier(c, n.info, s)
|
||||
elif s.magic == mNone: result = semDirectOp(c, n, flags)
|
||||
else: result = semMagic(c, n, s, flags)
|
||||
of skProc, skFunc, skMethod, skConverter, skIterator:
|
||||
@@ -2365,6 +2392,7 @@ proc semExpr(c: PContext, n: PNode, flags: TExprFlags = {}): PNode =
|
||||
if n.len != 1 and n.len != 2: illFormedAst(n)
|
||||
for i in 0 ..< n.len:
|
||||
n.sons[i] = semExpr(c, n.sons[i])
|
||||
of nkComesFrom: discard "ignore the comes from information for now"
|
||||
else:
|
||||
localError(n.info, errInvalidExpressionX,
|
||||
renderTree(n, {renderNoComments}))
|
||||
|
||||
@@ -89,7 +89,7 @@ proc semForObjectFields(c: TFieldsCtx, typ, forLoop, father: PNode) =
|
||||
access.sons[1] = newSymNode(typ.sons[0].sym, forLoop.info)
|
||||
caseStmt.add(semExprWithType(c.c, access))
|
||||
# copy the branches over, but replace the fields with the for loop body:
|
||||
for i in 1 .. <typ.len:
|
||||
for i in 1 ..< typ.len:
|
||||
var branch = copyTree(typ[i])
|
||||
let L = branch.len
|
||||
branch.sons[L-1] = newNodeI(nkStmtList, forLoop.info)
|
||||
|
||||
@@ -92,26 +92,6 @@ proc pickIntRange(a, b: PType): PType =
|
||||
proc isIntRangeOrLit(t: PType): bool =
|
||||
result = isIntRange(t) or isIntLit(t)
|
||||
|
||||
proc pickMinInt(n: PNode): BiggestInt =
|
||||
if n.kind in {nkIntLit..nkUInt64Lit}:
|
||||
result = n.intVal
|
||||
elif isIntLit(n.typ):
|
||||
result = n.typ.n.intVal
|
||||
elif isIntRange(n.typ):
|
||||
result = firstOrd(n.typ)
|
||||
else:
|
||||
internalError(n.info, "pickMinInt")
|
||||
|
||||
proc pickMaxInt(n: PNode): BiggestInt =
|
||||
if n.kind in {nkIntLit..nkUInt64Lit}:
|
||||
result = n.intVal
|
||||
elif isIntLit(n.typ):
|
||||
result = n.typ.n.intVal
|
||||
elif isIntRange(n.typ):
|
||||
result = lastOrd(n.typ)
|
||||
else:
|
||||
internalError(n.info, "pickMaxInt")
|
||||
|
||||
proc makeRange(typ: PType, first, last: BiggestInt): PType =
|
||||
let minA = min(first, last)
|
||||
let maxA = max(first, last)
|
||||
@@ -137,116 +117,6 @@ proc makeRangeF(typ: PType, first, last: BiggestFloat): PType =
|
||||
result.n = n
|
||||
addSonSkipIntLit(result, skipTypes(typ, {tyRange}))
|
||||
|
||||
proc getIntervalType*(m: TMagic, n: PNode): PType =
|
||||
# Nim requires interval arithmetic for ``range`` types. Lots of tedious
|
||||
# work but the feature is very nice for reducing explicit conversions.
|
||||
const ordIntLit = {nkIntLit..nkUInt64Lit}
|
||||
result = n.typ
|
||||
|
||||
template commutativeOp(opr: untyped) =
|
||||
let a = n.sons[1]
|
||||
let b = n.sons[2]
|
||||
if isIntRangeOrLit(a.typ) and isIntRangeOrLit(b.typ):
|
||||
result = makeRange(pickIntRange(a.typ, b.typ),
|
||||
opr(pickMinInt(a), pickMinInt(b)),
|
||||
opr(pickMaxInt(a), pickMaxInt(b)))
|
||||
|
||||
template binaryOp(opr: untyped) =
|
||||
let a = n.sons[1]
|
||||
let b = n.sons[2]
|
||||
if isIntRange(a.typ) and b.kind in {nkIntLit..nkUInt64Lit}:
|
||||
result = makeRange(a.typ,
|
||||
opr(pickMinInt(a), pickMinInt(b)),
|
||||
opr(pickMaxInt(a), pickMaxInt(b)))
|
||||
|
||||
case m
|
||||
of mUnaryMinusI, mUnaryMinusI64:
|
||||
let a = n.sons[1].typ
|
||||
if isIntRange(a):
|
||||
# (1..3) * (-1) == (-3.. -1)
|
||||
result = makeRange(a, 0|-|lastOrd(a), 0|-|firstOrd(a))
|
||||
of mUnaryMinusF64:
|
||||
let a = n.sons[1].typ
|
||||
if isFloatRange(a):
|
||||
result = makeRangeF(a, -getFloat(a.n.sons[1]),
|
||||
-getFloat(a.n.sons[0]))
|
||||
of mAbsF64:
|
||||
let a = n.sons[1].typ
|
||||
if isFloatRange(a):
|
||||
# abs(-5.. 1) == (1..5)
|
||||
if a.n[0].floatVal <= 0.0:
|
||||
result = makeRangeF(a, 0.0, abs(getFloat(a.n.sons[0])))
|
||||
else:
|
||||
result = makeRangeF(a, abs(getFloat(a.n.sons[1])),
|
||||
abs(getFloat(a.n.sons[0])))
|
||||
of mAbsI:
|
||||
let a = n.sons[1].typ
|
||||
if isIntRange(a):
|
||||
if a.n[0].intVal <= 0:
|
||||
result = makeRange(a, 0, `|abs|`(getInt(a.n.sons[0])))
|
||||
else:
|
||||
result = makeRange(a, `|abs|`(getInt(a.n.sons[1])),
|
||||
`|abs|`(getInt(a.n.sons[0])))
|
||||
of mSucc:
|
||||
let a = n.sons[1].typ
|
||||
let b = n.sons[2].typ
|
||||
if isIntRange(a) and isIntLit(b):
|
||||
# (-5.. 1) + 6 == (-5 + 6)..(-1 + 6)
|
||||
result = makeRange(a, pickMinInt(n.sons[1]) |+| pickMinInt(n.sons[2]),
|
||||
pickMaxInt(n.sons[1]) |+| pickMaxInt(n.sons[2]))
|
||||
of mPred:
|
||||
let a = n.sons[1].typ
|
||||
let b = n.sons[2].typ
|
||||
if isIntRange(a) and isIntLit(b):
|
||||
result = makeRange(a, pickMinInt(n.sons[1]) |-| pickMinInt(n.sons[2]),
|
||||
pickMaxInt(n.sons[1]) |-| pickMaxInt(n.sons[2]))
|
||||
of mAddI, mAddU:
|
||||
commutativeOp(`|+|`)
|
||||
of mMulI, mMulU:
|
||||
commutativeOp(`|*|`)
|
||||
of mSubI, mSubU:
|
||||
binaryOp(`|-|`)
|
||||
of mBitandI:
|
||||
# since uint64 is still not even valid for 'range' (since it's no ordinal
|
||||
# yet), we exclude it from the list (see bug #1638) for now:
|
||||
var a = n.sons[1]
|
||||
var b = n.sons[2]
|
||||
# symmetrical:
|
||||
if b.kind notin ordIntLit: swap(a, b)
|
||||
if b.kind in ordIntLit:
|
||||
let x = b.intVal|+|1
|
||||
if (x and -x) == x and x >= 0:
|
||||
result = makeRange(n.typ, 0, b.intVal)
|
||||
of mModU:
|
||||
let a = n.sons[1]
|
||||
let b = n.sons[2]
|
||||
if b.kind in ordIntLit:
|
||||
if b.intVal >= 0:
|
||||
result = makeRange(n.typ, 0, b.intVal-1)
|
||||
else:
|
||||
result = makeRange(n.typ, b.intVal+1, 0)
|
||||
of mModI:
|
||||
# so ... if you ever wondered about modulo's signedness; this defines it:
|
||||
let a = n.sons[1]
|
||||
let b = n.sons[2]
|
||||
if b.kind in {nkIntLit..nkUInt64Lit}:
|
||||
if b.intVal >= 0:
|
||||
result = makeRange(n.typ, -(b.intVal-1), b.intVal-1)
|
||||
else:
|
||||
result = makeRange(n.typ, b.intVal+1, -(b.intVal+1))
|
||||
of mDivI, mDivU:
|
||||
binaryOp(`|div|`)
|
||||
of mMinI:
|
||||
commutativeOp(min)
|
||||
of mMaxI:
|
||||
commutativeOp(max)
|
||||
else: discard
|
||||
|
||||
discard """
|
||||
mShlI,
|
||||
mShrI, mAddF64, mSubF64, mMulF64, mDivF64, mMaxF64, mMinF64
|
||||
"""
||||
|
||||
proc evalIs(n, a: PNode): PNode =
|
||||
# XXX: This should use the standard isOpImpl
|
||||
internalAssert a.kind == nkSym and a.sym.kind == skType
|
||||
@@ -736,13 +606,13 @@ proc getConstExpr(m: PSym, n: PNode): PNode =
|
||||
if a == nil: return nil
|
||||
result.sons[i] = a
|
||||
incl(result.flags, nfAllConst)
|
||||
of nkObjConstr:
|
||||
result = copyTree(n)
|
||||
for i in countup(1, sonsLen(n) - 1):
|
||||
var a = getConstExpr(m, n.sons[i].sons[1])
|
||||
if a == nil: return nil
|
||||
result.sons[i].sons[1] = a
|
||||
incl(result.flags, nfAllConst)
|
||||
#of nkObjConstr:
|
||||
# result = copyTree(n)
|
||||
# for i in countup(1, sonsLen(n) - 1):
|
||||
# var a = getConstExpr(m, n.sons[i].sons[1])
|
||||
# if a == nil: return nil
|
||||
# result.sons[i].sons[1] = a
|
||||
# incl(result.flags, nfAllConst)
|
||||
of nkPar:
|
||||
# tuple constructor
|
||||
result = copyTree(n)
|
||||
@@ -785,5 +655,8 @@ proc getConstExpr(m: PSym, n: PNode): PNode =
|
||||
result.typ = n.typ
|
||||
of nkBracketExpr: result = foldArrayAccess(m, n)
|
||||
of nkDotExpr: result = foldFieldAccess(m, n)
|
||||
of nkStmtListExpr:
|
||||
if n.len == 2 and n[0].kind == nkComesFrom:
|
||||
result = getConstExpr(m, n[1])
|
||||
else:
|
||||
discard
|
||||
|
||||
@@ -186,7 +186,7 @@ proc semGenericStmt(c: PContext, n: PNode,
|
||||
let a = n.sym
|
||||
let b = getGenSym(c, a)
|
||||
if b != a: n.sym = b
|
||||
of nkEmpty, succ(nkSym)..nkNilLit:
|
||||
of nkEmpty, succ(nkSym)..nkNilLit, nkComesFrom:
|
||||
# see tests/compile/tgensymgeneric.nim:
|
||||
# We need to open the gensym'ed symbol again so that the instantiation
|
||||
# creates a fresh copy; but this is wrong the very first reason for gensym
|
||||
@@ -210,7 +210,7 @@ proc semGenericStmt(c: PContext, n: PNode,
|
||||
considerQuotedIdent(fn).id notin ctx.toMixin:
|
||||
errorUndeclaredIdentifier(c, n.info, fn.renderTree)
|
||||
|
||||
var first = ord(withinConcept in flags)
|
||||
var first = int ord(withinConcept in flags)
|
||||
var mixinContext = false
|
||||
if s != nil:
|
||||
incl(s.flags, sfUsed)
|
||||
@@ -335,8 +335,10 @@ proc semGenericStmt(c: PContext, n: PNode,
|
||||
n.sons[L - 2] = semGenericStmt(c, n.sons[L-2], flags, ctx)
|
||||
for i in countup(0, L - 3):
|
||||
addTempDecl(c, n.sons[i], skForVar)
|
||||
openScope(c)
|
||||
n.sons[L - 1] = semGenericStmt(c, n.sons[L-1], flags, ctx)
|
||||
closeScope(c)
|
||||
closeScope(c)
|
||||
of nkBlockStmt, nkBlockExpr, nkBlockType:
|
||||
checkSonsLen(n, 2)
|
||||
openScope(c)
|
||||
|
||||
@@ -121,14 +121,14 @@ proc freshGenSyms(n: PNode, owner, orig: PSym, symMap: var TIdTable) =
|
||||
idTablePut(symMap, s, x)
|
||||
n.sym = x
|
||||
else:
|
||||
for i in 0 .. <safeLen(n): freshGenSyms(n.sons[i], owner, orig, symMap)
|
||||
for i in 0 ..< safeLen(n): freshGenSyms(n.sons[i], owner, orig, symMap)
|
||||
|
||||
proc addParamOrResult(c: PContext, param: PSym, kind: TSymKind)
|
||||
|
||||
proc instantiateBody(c: PContext, n, params: PNode, result, orig: PSym) =
|
||||
if n.sons[bodyPos].kind != nkEmpty:
|
||||
let procParams = result.typ.n
|
||||
for i in 1 .. <procParams.len:
|
||||
for i in 1 ..< procParams.len:
|
||||
addDecl(c, procParams[i].sym)
|
||||
maybeAddResult(c, result, result.ast)
|
||||
|
||||
@@ -138,7 +138,7 @@ proc instantiateBody(c: PContext, n, params: PNode, result, orig: PSym) =
|
||||
var symMap: TIdTable
|
||||
initIdTable symMap
|
||||
if params != nil:
|
||||
for i in 1 .. <params.len:
|
||||
for i in 1 ..< params.len:
|
||||
let param = params[i].sym
|
||||
if sfGenSym in param.flags:
|
||||
idTablePut(symMap, params[i].sym, result.typ.n[param.position+1].sym)
|
||||
@@ -211,7 +211,7 @@ proc instantiateProcType(c: PContext, pt: TIdTable,
|
||||
let originalParams = result.n
|
||||
result.n = originalParams.shallowCopy
|
||||
|
||||
for i in 1 .. <result.len:
|
||||
for i in 1 ..< result.len:
|
||||
# twrong_field_caching requires these 'resetIdTable' calls:
|
||||
if i > 1:
|
||||
resetIdTable(cl.symMap)
|
||||
@@ -240,6 +240,8 @@ proc instantiateProcType(c: PContext, pt: TIdTable,
|
||||
resetIdTable(cl.localCache)
|
||||
result.sons[0] = replaceTypeVarsT(cl, result.sons[0])
|
||||
result.n.sons[0] = originalParams[0].copyTree
|
||||
if result.sons[0] != nil:
|
||||
propagateToOwner(result, result.sons[0])
|
||||
|
||||
eraseVoidParams(result)
|
||||
skipIntLiteralParams(result)
|
||||
|
||||
@@ -42,7 +42,7 @@ proc annotateType*(n: PNode, t: PType) =
|
||||
of nkObjConstr:
|
||||
let x = t.skipTypes(abstractPtrs)
|
||||
n.typ = t
|
||||
for i in 1 .. <n.len:
|
||||
for i in 1 ..< n.len:
|
||||
var j = i-1
|
||||
let field = x.n.ithField(j)
|
||||
if field.isNil:
|
||||
@@ -53,7 +53,7 @@ proc annotateType*(n: PNode, t: PType) =
|
||||
of nkPar:
|
||||
if x.kind == tyTuple:
|
||||
n.typ = t
|
||||
for i in 0 .. <n.len:
|
||||
for i in 0 ..< n.len:
|
||||
if i >= x.len: globalError n.info, "invalid field at index " & $i
|
||||
else: annotateType(n.sons[i], x.sons[i])
|
||||
elif x.kind == tyProc and x.callConv == ccClosure:
|
||||
|
||||
@@ -38,9 +38,7 @@ proc skipAddr(n: PNode): PNode {.inline.} =
|
||||
proc semArrGet(c: PContext; n: PNode; flags: TExprFlags): PNode =
|
||||
result = newNodeI(nkBracketExpr, n.info)
|
||||
for i in 1..<n.len: result.add(n[i])
|
||||
let oldBracketExpr = c.p.bracketExpr
|
||||
result = semSubscript(c, result, flags)
|
||||
c.p.bracketExpr = oldBracketExpr
|
||||
if result.isNil:
|
||||
let x = copyTree(n)
|
||||
x.sons[0] = newIdentNode(getIdent"[]", n.info)
|
||||
@@ -129,7 +127,7 @@ proc evalTypeTrait(traitCall: PNode, operand: PType, context: PSym): PNode =
|
||||
of "not":
|
||||
return typeWithSonsResult(tyNot, @[operand])
|
||||
of "name":
|
||||
result = newStrNode(nkStrLit, operand.typeToString(preferName))
|
||||
result = newStrNode(nkStrLit, operand.typeToString(preferTypeName))
|
||||
result.typ = newType(tyString, context)
|
||||
result.info = traitCall.info
|
||||
of "arity":
|
||||
@@ -146,8 +144,14 @@ proc evalTypeTrait(traitCall: PNode, operand: PType, context: PSym): PNode =
|
||||
result = res.base.toNode(traitCall.info)
|
||||
of "stripGenericParams":
|
||||
result = uninstantiate(operand).toNode(traitCall.info)
|
||||
of "supportsCopyMem":
|
||||
let t = operand.skipTypes({tyVar, tyGenericInst, tyAlias, tyInferred})
|
||||
let complexObj = containsGarbageCollectedRef(t) or
|
||||
hasDestructor(t)
|
||||
result = newIntNodeT(ord(not complexObj), traitCall)
|
||||
else:
|
||||
internalAssert false
|
||||
localError(traitCall.info, "unknown trait")
|
||||
result = emptyNode
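``evalTypeTrait`` gains a ``supportsCopyMem`` answer: a type supports it when it contains neither GC'ed references nor a destructor. A hedged sketch of querying the trait, assuming it is surfaced through ``typetraits``:

```nim
import typetraits

type
  Plain = object
    x: int
    y: float
  Managed = object
    s: string        # GC'ed field in this compiler version

echo supportsCopyMem(Plain)    # expected: true
echo supportsCopyMem(Managed)  # expected: false
```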
||||
|
||||
proc semTypeTraits(c: PContext, n: PNode): PNode =
|
||||
checkMinSonsLen(n, 2)
|
||||
@@ -164,7 +168,9 @@ proc semTypeTraits(c: PContext, n: PNode): PNode =
|
||||
proc semOrd(c: PContext, n: PNode): PNode =
|
||||
result = n
|
||||
let parType = n.sons[1].typ
|
||||
if isOrdinalType(parType) or parType.kind == tySet:
|
||||
if isOrdinalType(parType):
|
||||
discard
|
||||
elif parType.kind == tySet:
|
||||
result.typ = makeRangeType(c, firstOrd(parType), lastOrd(parType), n.info)
|
||||
else:
|
||||
localError(n.info, errOrdinalTypeExpected)
|
||||
@@ -246,7 +252,11 @@ proc magicsAfterOverloadResolution(c: PContext, n: PNode,
|
||||
result = semTypeOf(c, n.sons[1])
|
||||
of mArrGet: result = semArrGet(c, n, flags)
|
||||
of mArrPut: result = semArrPut(c, n, flags)
|
||||
of mAsgn: result = semAsgnOpr(c, n)
|
||||
of mAsgn:
|
||||
if n[0].sym.name.s == "=":
|
||||
result = semAsgnOpr(c, n)
|
||||
else:
|
||||
result = n
|
||||
of mIsPartOf: result = semIsPartOf(c, n, flags)
|
||||
of mTypeTrait: result = semTypeTraits(c, n)
|
||||
of mAstToStr:
|
||||
@@ -264,35 +274,7 @@ proc magicsAfterOverloadResolution(c: PContext, n: PNode,
|
||||
of mDotDot:
|
||||
result = n
|
||||
of mRoof:
|
||||
let bracketExpr = if n.len == 3: n.sons[2] else: c.p.bracketExpr
|
||||
if bracketExpr.isNil:
|
||||
localError(n.info, "no surrounding array access context for '^'")
|
||||
result = n.sons[1]
|
||||
elif bracketExpr.checkForSideEffects != seNoSideEffect:
|
||||
localError(n.info, "invalid context for '^' as '$#' has side effects" %
|
||||
renderTree(bracketExpr))
|
||||
result = n.sons[1]
|
||||
elif bracketExpr.typ.isStrangeArray:
|
||||
localError(n.info, "invalid context for '^' as len!=high+1 for '$#'" %
|
||||
renderTree(bracketExpr))
|
||||
result = n.sons[1]
|
||||
else:
|
||||
# ^x is rewritten to: len(a)-x
|
||||
let lenExpr = newNodeI(nkCall, n.info)
|
||||
lenExpr.add newIdentNode(getIdent"len", n.info)
|
||||
lenExpr.add bracketExpr
|
||||
let lenExprB = semExprWithType(c, lenExpr)
|
||||
if lenExprB.typ.isNil or not isOrdinalType(lenExprB.typ):
|
||||
localError(n.info, "'$#' has to be of an ordinal type for '^'" %
|
||||
renderTree(lenExpr))
|
||||
result = n.sons[1]
|
||||
else:
|
||||
result = newNodeIT(nkCall, n.info, getSysType(tyInt))
|
||||
let subi = getSysMagic("-", mSubI)
|
||||
#echo "got ", typeToString(subi.typ)
|
||||
result.add newSymNode(subi, n.info)
|
||||
result.add lenExprB
|
||||
result.add n.sons[1]
|
||||
localError(n.info, "builtin roof operator is not supported anymore")
||||
of mPlugin:
|
||||
let plugin = getPlugin(n[0].sym)
|
||||
if plugin.isNil:
|
||||
|
||||
@@ -39,13 +39,19 @@ proc mergeInitStatus(existing: var InitStatus, newStatus: InitStatus) =
|
||||
of initUnknown:
|
||||
discard
|
||||
|
||||
proc invalidObjConstr(n: PNode) =
|
||||
if n.kind == nkInfix and n[0].kind == nkIdent and n[0].ident.s[0] == ':':
|
||||
localError(n.info, "incorrect object construction syntax; use a space after the colon")
|
||||
else:
|
||||
localError(n.info, "incorrect object construction syntax")
|
||||
|
||||
proc locateFieldInInitExpr(field: PSym, initExpr: PNode): PNode =
|
||||
# Returns the assignment nkExprColonExpr node or nil
|
||||
let fieldId = field.name.id
|
||||
for i in 1 .. <initExpr.len:
|
||||
for i in 1 ..< initExpr.len:
|
||||
let assignment = initExpr[i]
|
||||
if assignment.kind != nkExprColonExpr:
|
||||
localError(initExpr.info, "incorrect object construction syntax")
|
||||
invalidObjConstr(assignment)
|
||||
continue
|
||||
|
||||
if fieldId == considerQuotedIdent(assignment[0]).id:
|
||||
@@ -78,13 +84,13 @@ proc caseBranchMatchesExpr(branch, matched: PNode): bool =
|
||||
|
||||
proc pickCaseBranch(caseExpr, matched: PNode): PNode =
|
||||
# XXX: Perhaps this proc already exists somewhere
|
||||
let endsWithElse = caseExpr{-1}.kind == nkElse
|
||||
let endsWithElse = caseExpr[^1].kind == nkElse
|
||||
for i in 1 .. caseExpr.len - 1 - int(endsWithElse):
|
||||
if caseExpr[i].caseBranchMatchesExpr(matched):
|
||||
return caseExpr[i]
|
||||
|
||||
if endsWithElse:
|
||||
return caseExpr{-1}
|
||||
return caseExpr[^1]
|
||||
|
||||
iterator directFieldsInRecList(recList: PNode): PNode =
|
||||
# XXX: We can remove this case by making all nkOfBranch nodes
|
||||
@@ -136,17 +142,20 @@ proc semConstructFields(c: PContext, recNode: PNode,
|
||||
|
||||
of nkRecCase:
|
||||
template fieldsPresentInBranch(branchIdx: int): string =
|
||||
fieldsPresentInInitExpr(recNode[branchIdx]{-1}, initExpr)
|
||||
let branch = recNode[branchIdx]
|
||||
let fields = branch[branch.len - 1]
|
||||
fieldsPresentInInitExpr(fields, initExpr)
|
||||
|
||||
template checkMissingFields(branchNode: PNode) =
|
||||
checkForMissingFields(branchNode{-1}, initExpr)
|
||||
let fields = branchNode[branchNode.len - 1]
|
||||
checkForMissingFields(fields, initExpr)
|
||||
|
||||
let discriminator = recNode.sons[0];
|
||||
internalAssert discriminator.kind == nkSym
|
||||
var selectedBranch = -1
|
||||
|
||||
for i in 1 .. <recNode.len:
|
||||
let innerRecords = recNode[i]{-1}
|
||||
for i in 1 ..< recNode.len:
|
||||
let innerRecords = recNode[i][^1]
|
||||
let status = semConstructFields(c, innerRecords, initExpr, flags)
|
||||
if status notin {initNone, initUnknown}:
|
||||
mergeInitStatus(result, status)
|
||||
@@ -220,7 +229,7 @@ proc semConstructFields(c: PContext, recNode: PNode,
|
||||
else:
|
||||
# All bets are off. If any of the branches has mandatory
# fields we must produce an error:
for i in 1 .. <recNode.len: checkMissingFields recNode[i]
|
||||
for i in 1 ..< recNode.len: checkMissingFields recNode[i]
|
||||
|
||||
of nkSym:
|
||||
let field = recNode.sym
|
||||
@@ -250,7 +259,7 @@ proc semObjConstr(c: PContext, n: PNode, flags: TExprFlags): PNode =
|
||||
var t = semTypeNode(c, n.sons[0], nil)
|
||||
result = newNodeIT(nkObjConstr, n.info, t)
|
||||
for child in n: result.add child
|
||||
|
||||
|
||||
t = skipTypes(t, {tyGenericInst, tyAlias})
|
||||
if t.kind == tyRef: t = skipTypes(t.sons[0], {tyGenericInst, tyAlias})
|
||||
if t.kind != tyObject:
|
||||
@@ -277,16 +286,16 @@ proc semObjConstr(c: PContext, n: PNode, flags: TExprFlags): PNode =
|
||||
# Since we were traversing the object fields, it's possible that
|
||||
# not all of the fields specified in the constructor were visited.
# We'll check for such fields here:
|
||||
for i in 1.. <result.len:
|
||||
for i in 1..<result.len:
|
||||
let field = result[i]
|
||||
if nfSem notin field.flags:
|
||||
if field.kind != nkExprColonExpr:
|
||||
localError(n.info, "incorrect object construction syntax")
|
||||
invalidObjConstr(field)
|
||||
continue
|
||||
let id = considerQuotedIdent(field[0])
|
||||
# This node was not processed. There are two possible reasons:
|
||||
# 1) It was shadowed by a field with the same name on the left
|
||||
for j in 1 .. <i:
|
||||
for j in 1 ..< i:
|
||||
let prevId = considerQuotedIdent(result[j][0])
|
||||
if prevId.id == id.id:
|
||||
localError(field.info, errFieldInitTwice, id.s)
|
||||
|
||||
@@ -81,7 +81,7 @@ proc initAnalysisCtx(): AnalysisCtx =
|
||||
result.guards = @[]
|
||||
|
||||
proc lookupSlot(c: AnalysisCtx; s: PSym): int =
|
||||
for i in 0.. <c.locals.len:
|
||||
for i in 0..<c.locals.len:
|
||||
if c.locals[i].v == s or c.locals[i].alias == s: return i
|
||||
return -1
|
||||
|
||||
@@ -94,7 +94,7 @@ proc getSlot(c: var AnalysisCtx; v: PSym): ptr MonotonicVar =
|
||||
return addr(c.locals[L])
|
||||
|
||||
proc gatherArgs(c: var AnalysisCtx; n: PNode) =
|
||||
for i in 0.. <n.safeLen:
|
||||
for i in 0..<n.safeLen:
|
||||
let root = getRoot n[i]
|
||||
if root != nil:
|
||||
block addRoot:
|
||||
@@ -119,7 +119,7 @@ proc checkLocal(c: AnalysisCtx; n: PNode) =
|
||||
if s >= 0 and c.locals[s].stride != nil:
|
||||
localError(n.info, "invalid usage of counter after increment")
|
||||
else:
|
||||
for i in 0 .. <n.safeLen: checkLocal(c, n.sons[i])
|
||||
for i in 0 ..< n.safeLen: checkLocal(c, n.sons[i])
|
||||
|
||||
template `?`(x): untyped = x.renderTree
|
||||
|
||||
@@ -180,7 +180,7 @@ proc stride(c: AnalysisCtx; n: PNode): BiggestInt =
|
||||
if s >= 0 and c.locals[s].stride != nil:
|
||||
result = c.locals[s].stride.intVal
|
||||
else:
|
||||
for i in 0 .. <n.safeLen: result += stride(c, n.sons[i])
|
||||
for i in 0 ..< n.safeLen: result += stride(c, n.sons[i])
|
||||
|
||||
proc subStride(c: AnalysisCtx; n: PNode): PNode =
|
||||
# substitute with stride:
|
||||
@@ -192,7 +192,7 @@ proc subStride(c: AnalysisCtx; n: PNode): PNode =
|
||||
result = n
|
||||
elif n.safeLen > 0:
|
||||
result = shallowCopy(n)
|
||||
for i in 0 .. <n.len: result.sons[i] = subStride(c, n.sons[i])
|
||||
for i in 0 ..< n.len: result.sons[i] = subStride(c, n.sons[i])
|
||||
else:
|
||||
result = n
|
||||
|
||||
@@ -251,7 +251,7 @@ proc checkSlicesAreDisjoint(c: var AnalysisCtx) =
|
||||
proc analyse(c: var AnalysisCtx; n: PNode)
|
||||
|
||||
proc analyseSons(c: var AnalysisCtx; n: PNode) =
|
||||
for i in 0 .. <safeLen(n): analyse(c, n[i])
|
||||
for i in 0 ..< safeLen(n): analyse(c, n[i])
|
||||
|
||||
proc min(a, b: PNode): PNode =
|
||||
if a.isNil: result = b
|
||||
@@ -293,11 +293,11 @@ proc analyseCall(c: var AnalysisCtx; n: PNode; op: PSym) =
|
||||
proc analyseCase(c: var AnalysisCtx; n: PNode) =
|
||||
analyse(c, n.sons[0])
|
||||
let oldFacts = c.guards.len
|
||||
for i in 1.. <n.len:
|
||||
for i in 1..<n.len:
|
||||
let branch = n.sons[i]
|
||||
setLen(c.guards, oldFacts)
|
||||
addCaseBranchFacts(c.guards, n, i)
|
||||
for i in 0 .. <branch.len:
|
||||
for i in 0 ..< branch.len:
|
||||
analyse(c, branch.sons[i])
|
||||
setLen(c.guards, oldFacts)
|
||||
|
||||
@@ -307,14 +307,14 @@ proc analyseIf(c: var AnalysisCtx; n: PNode) =
|
||||
addFact(c.guards, canon(n.sons[0].sons[0]))
|
||||
|
||||
analyse(c, n.sons[0].sons[1])
|
||||
for i in 1.. <n.len:
|
||||
for i in 1..<n.len:
|
||||
let branch = n.sons[i]
|
||||
setLen(c.guards, oldFacts)
|
||||
for j in 0..i-1:
|
||||
addFactNeg(c.guards, canon(n.sons[j].sons[0]))
|
||||
if branch.len > 1:
|
||||
addFact(c.guards, canon(branch.sons[0]))
|
||||
for i in 0 .. <branch.len:
|
||||
for i in 0 ..< branch.len:
|
||||
analyse(c, branch.sons[i])
|
||||
setLen(c.guards, oldFacts)
|
||||
|
||||
@@ -407,7 +407,7 @@ proc transformSlices(n: PNode): PNode =
|
||||
return result
|
||||
if n.safeLen > 0:
|
||||
result = shallowCopy(n)
|
||||
for i in 0 .. < n.len:
|
||||
for i in 0 ..< n.len:
|
||||
result.sons[i] = transformSlices(n.sons[i])
|
||||
else:
|
||||
result = n
|
||||
@@ -415,7 +415,7 @@ proc transformSlices(n: PNode): PNode =
|
||||
proc transformSpawn(owner: PSym; n, barrier: PNode): PNode
|
||||
proc transformSpawnSons(owner: PSym; n, barrier: PNode): PNode =
|
||||
result = shallowCopy(n)
|
||||
for i in 0 .. < n.len:
|
||||
for i in 0 ..< n.len:
|
||||
result.sons[i] = transformSpawn(owner, n.sons[i], barrier)
|
||||
|
||||
proc transformSpawn(owner: PSym; n, barrier: PNode): PNode =
|
||||
|
||||
@@ -248,7 +248,7 @@ type
|
||||
TIntersection = seq[tuple[id, count: int]] # a simple count table
|
||||
|
||||
proc addToIntersection(inter: var TIntersection, s: int) =
|
||||
for j in 0.. <inter.len:
|
||||
for j in 0..<inter.len:
|
||||
if s == inter[j].id:
|
||||
inc inter[j].count
|
||||
return
|
||||
@@ -282,7 +282,7 @@ proc createTag(n: PNode): PNode =
|
||||
proc addEffect(a: PEffects, e: PNode, useLineInfo=true) =
|
||||
assert e.kind != nkRaiseStmt
|
||||
var aa = a.exc
|
||||
for i in a.bottom .. <aa.len:
|
||||
for i in a.bottom ..< aa.len:
|
||||
if sameType(aa[i].excType, e.excType):
|
||||
if not useLineInfo or gCmd == cmdDoc: return
|
||||
elif aa[i].info == e.info: return
|
||||
@@ -290,7 +290,7 @@ proc addEffect(a: PEffects, e: PNode, useLineInfo=true) =
|
||||
|
||||
proc addTag(a: PEffects, e: PNode, useLineInfo=true) =
|
||||
var aa = a.tags
|
||||
for i in 0 .. <aa.len:
|
||||
for i in 0 ..< aa.len:
|
||||
if sameType(aa[i].typ.skipTypes(skipPtrs), e.typ.skipTypes(skipPtrs)):
|
||||
if not useLineInfo or gCmd == cmdDoc: return
|
||||
elif aa[i].info == e.info: return
|
||||
@@ -345,12 +345,12 @@ proc trackTryStmt(tracked: PEffects, n: PNode) =
|
||||
inc tracked.inTryStmt
|
||||
track(tracked, n.sons[0])
|
||||
dec tracked.inTryStmt
|
||||
for i in oldState.. <tracked.init.len:
|
||||
for i in oldState..<tracked.init.len:
|
||||
addToIntersection(inter, tracked.init[i])
|
||||
|
||||
var branches = 1
|
||||
var hasFinally = false
|
||||
for i in 1 .. < n.len:
|
||||
for i in 1 ..< n.len:
|
||||
let b = n.sons[i]
|
||||
let blen = sonsLen(b)
|
||||
if b.kind == nkExceptBranch:
|
||||
@@ -364,7 +364,7 @@ proc trackTryStmt(tracked: PEffects, n: PNode) =
|
||||
|
||||
setLen(tracked.init, oldState)
|
||||
track(tracked, b.sons[blen-1])
|
||||
for i in oldState.. <tracked.init.len:
|
||||
for i in oldState..<tracked.init.len:
|
||||
addToIntersection(inter, tracked.init[i])
|
||||
else:
|
||||
assert b.kind == nkFinally
|
||||
@@ -420,7 +420,7 @@ proc documentEffect(n, x: PNode, effectType: TSpecialWord, idx: int): PNode =
|
||||
|
||||
# warning: hack ahead:
|
||||
var effects = newNodeI(nkBracket, n.info, real.len)
|
||||
for i in 0 .. <real.len:
|
||||
for i in 0 ..< real.len:
|
||||
var t = typeToString(real[i].typ)
|
||||
if t.startsWith("ref "): t = substr(t, 4)
|
||||
effects.sons[i] = newIdentNode(getIdent(t), n.info)
|
||||
@@ -518,13 +518,15 @@ proc notNilCheck(tracked: PEffects, n: PNode, paramType: PType) =
|
||||
procVarcheck skipConvAndClosure(n)
|
||||
#elif n.kind in nkSymChoices:
|
||||
# echo "came here"
|
||||
let paramType = paramType.skipTypesOrNil(abstractInst)
|
||||
if paramType != nil and tfNotNil in paramType.flags and
|
||||
n.typ != nil and tfNotNil notin n.typ.flags:
|
||||
if n.kind == nkAddr:
|
||||
# addr(x[]) can't be proven, but addr(x) can:
|
||||
if not containsNode(n, {nkDerefExpr, nkHiddenDeref}): return
|
||||
elif (n.kind == nkSym and n.sym.kind in routineKinds) or
|
||||
n.kind in procDefs+{nkObjConstr, nkBracket}:
|
||||
(n.kind in procDefs+{nkObjConstr, nkBracket, nkClosure, nkStrLit..nkTripleStrLit}) or
|
||||
(n.kind in nkCallKinds and n[0].kind == nkSym and n[0].sym.magic == mArrToSeq):
|
||||
# 'p' is not nil obviously:
|
||||
return
|
||||
case impliesNotNil(tracked.guards, n)
|
||||
@@ -613,16 +615,16 @@ proc trackCase(tracked: PEffects, n: PNode) =
|
||||
warnProveField in gNotes
|
||||
var inter: TIntersection = @[]
|
||||
var toCover = 0
|
||||
for i in 1.. <n.len:
|
||||
for i in 1..<n.len:
|
||||
let branch = n.sons[i]
|
||||
setLen(tracked.init, oldState)
|
||||
if interesting:
|
||||
setLen(tracked.guards, oldFacts)
|
||||
addCaseBranchFacts(tracked.guards, n, i)
|
||||
for i in 0 .. <branch.len:
|
||||
for i in 0 ..< branch.len:
|
||||
track(tracked, branch.sons[i])
|
||||
if not breaksBlock(branch.lastSon): inc toCover
|
||||
for i in oldState.. <tracked.init.len:
|
||||
for i in oldState..<tracked.init.len:
|
||||
addToIntersection(inter, tracked.init[i])
|
||||
|
||||
setLen(tracked.init, oldState)
|
||||
@@ -642,10 +644,10 @@ proc trackIf(tracked: PEffects, n: PNode) =
|
||||
var toCover = 0
|
||||
track(tracked, n.sons[0].sons[1])
|
||||
if not breaksBlock(n.sons[0].sons[1]): inc toCover
|
||||
for i in oldState.. <tracked.init.len:
|
||||
for i in oldState..<tracked.init.len:
|
||||
addToIntersection(inter, tracked.init[i])
|
||||
|
||||
for i in 1.. <n.len:
|
||||
for i in 1..<n.len:
|
||||
let branch = n.sons[i]
|
||||
setLen(tracked.guards, oldFacts)
|
||||
for j in 0..i-1:
|
||||
@@ -653,10 +655,10 @@ proc trackIf(tracked: PEffects, n: PNode) =
|
||||
if branch.len > 1:
|
||||
addFact(tracked.guards, branch.sons[0])
|
||||
setLen(tracked.init, oldState)
|
||||
for i in 0 .. <branch.len:
|
||||
for i in 0 ..< branch.len:
|
||||
track(tracked, branch.sons[i])
|
||||
if not breaksBlock(branch.lastSon): inc toCover
|
||||
for i in oldState.. <tracked.init.len:
|
||||
for i in oldState..<tracked.init.len:
|
||||
addToIntersection(inter, tracked.init[i])
|
||||
setLen(tracked.init, oldState)
|
||||
if lastSon(n).len == 1:
|
||||
@@ -668,7 +670,7 @@ proc trackIf(tracked: PEffects, n: PNode) =
|
||||
proc trackBlock(tracked: PEffects, n: PNode) =
|
||||
if n.kind in {nkStmtList, nkStmtListExpr}:
|
||||
var oldState = -1
|
||||
for i in 0.. <n.len:
|
||||
for i in 0..<n.len:
|
||||
if hasSubnodeWith(n.sons[i], nkBreakStmt):
|
||||
# block:
|
||||
# x = def
|
||||
@@ -701,7 +703,7 @@ proc track(tracked: PEffects, n: PNode) =
|
||||
n.sons[0].info = n.info
|
||||
#throws(tracked.exc, n.sons[0])
|
||||
addEffect(tracked, n.sons[0], useLineInfo=false)
|
||||
for i in 0 .. <safeLen(n):
|
||||
for i in 0 ..< safeLen(n):
|
||||
track(tracked, n.sons[i])
|
||||
of nkCallKinds:
|
||||
# p's effects are ours too:
|
||||
@@ -752,11 +754,11 @@ proc track(tracked: PEffects, n: PNode) =
|
||||
discard
|
||||
else:
|
||||
message(arg.info, warnProveInit, $arg)
|
||||
for i in 0 .. <safeLen(n):
|
||||
for i in 0 ..< safeLen(n):
|
||||
track(tracked, n.sons[i])
|
||||
of nkDotExpr:
|
||||
guardDotAccess(tracked, n)
|
||||
for i in 0 .. <len(n): track(tracked, n.sons[i])
|
||||
for i in 0 ..< len(n): track(tracked, n.sons[i])
|
||||
of nkCheckedFieldExpr:
|
||||
track(tracked, n.sons[0])
|
||||
if warnProveField in gNotes: checkFieldAccess(tracked.guards, n)
|
||||
@@ -804,13 +806,13 @@ proc track(tracked: PEffects, n: PNode) =
|
||||
of nkForStmt, nkParForStmt:
|
||||
# we are very conservative here and assume the loop is never executed:
|
||||
let oldState = tracked.init.len
|
||||
for i in 0 .. <len(n):
|
||||
for i in 0 ..< len(n):
|
||||
track(tracked, n.sons[i])
|
||||
setLen(tracked.init, oldState)
|
||||
of nkObjConstr:
|
||||
when false: track(tracked, n.sons[0])
|
||||
let oldFacts = tracked.guards.len
|
||||
for i in 1 .. <len(n):
|
||||
for i in 1 ..< len(n):
|
||||
let x = n.sons[i]
|
||||
track(tracked, x)
|
||||
if x.sons[0].kind == nkSym and sfDiscriminant in x.sons[0].sym.flags:
|
||||
@@ -821,7 +823,7 @@ proc track(tracked: PEffects, n: PNode) =
|
||||
let oldLocked = tracked.locked.len
|
||||
let oldLockLevel = tracked.currLockLevel
|
||||
var enforcedGcSafety = false
|
||||
for i in 0 .. <pragmaList.len:
|
||||
for i in 0 ..< pragmaList.len:
|
||||
let pragma = whichPragma(pragmaList.sons[i])
|
||||
if pragma == wLocks:
|
||||
lockLocations(tracked, pragmaList.sons[i])
|
||||
@@ -840,7 +842,7 @@ proc track(tracked: PEffects, n: PNode) =
|
||||
of nkObjUpConv, nkObjDownConv, nkChckRange, nkChckRangeF, nkChckRange64:
|
||||
if n.len == 1: track(tracked, n.sons[0])
|
||||
else:
|
||||
for i in 0 .. <safeLen(n): track(tracked, n.sons[i])
|
||||
for i in 0 ..< safeLen(n): track(tracked, n.sons[i])
|
||||
|
||||
proc subtypeRelation(spec, real: PNode): bool =
|
||||
result = safeInheritanceDiff(real.excType, spec.typ) <= 0
|
||||
@@ -852,7 +854,7 @@ proc checkRaisesSpec(spec, real: PNode, msg: string, hints: bool;
|
||||
var used = initIntSet()
|
||||
for r in items(real):
|
||||
block search:
|
||||
for s in 0 .. <spec.len:
|
||||
for s in 0 ..< spec.len:
|
||||
if effectPredicate(spec[s], r):
|
||||
used.incl(s)
|
||||
break search
|
||||
@@ -862,7 +864,7 @@ proc checkRaisesSpec(spec, real: PNode, msg: string, hints: bool;
|
||||
popInfoContext()
|
||||
# hint about unnecessarily listed exception types:
|
||||
if hints:
|
||||
for s in 0 .. <spec.len:
|
||||
for s in 0 ..< spec.len:
|
||||
if not used.contains(s):
|
||||
message(spec[s].info, hintXDeclaredButNotUsed, renderTree(spec[s]))
|
||||
|
||||
@@ -977,10 +979,10 @@ proc trackProc*(s: PSym, body: PNode) =
|
||||
message(s.info, warnLockLevel,
|
||||
"declared lock level is $1, but real lock level is $2" %
|
||||
[$s.typ.lockLevel, $t.maxLockLevel])
|
||||
when false:
|
||||
when defined(useDfa):
|
||||
if s.kind == skFunc:
|
||||
when defined(dfa): dataflowAnalysis(s, body)
|
||||
trackWrites(s, body)
|
||||
dataflowAnalysis(s, body)
|
||||
when false: trackWrites(s, body)
|
||||
|
||||
proc trackTopLevelStmt*(module: PSym; n: PNode) =
|
||||
if n.kind in {nkPragma, nkMacroDef, nkTemplateDef, nkProcDef, nkFuncDef,
|
||||
|
||||
@@ -97,27 +97,12 @@ template semProcvarCheck(c: PContext, n: PNode) =

proc semProc(c: PContext, n: PNode): PNode

include semdestruct

proc semDestructorCheck(c: PContext, n: PNode, flags: TExprFlags) {.inline.} =
if not newDestructors:
if efAllowDestructor notin flags and
n.kind in nkCallKinds+{nkObjConstr,nkBracket}:
if instantiateDestructor(c, n.typ) != nil:
localError(n.info, warnDestructor)
# This still breaks too many things:
when false:
if efDetermineType notin flags and n.typ.kind == tyTypeDesc and
c.p.owner.kind notin {skTemplate, skMacro}:
localError(n.info, errGenerated, "value expected, but got a type")

proc semExprBranch(c: PContext, n: PNode): PNode =
result = semExpr(c, n)
if result.typ != nil:
# XXX tyGenericInst here?
semProcvarCheck(c, result)
if result.typ.kind == tyVar: result = newDeref(result)
semDestructorCheck(c, result, {})

proc semExprBranchScope(c: PContext, n: PNode): PNode =
openScope(c)
@@ -180,14 +165,14 @@ proc semIf(c: PContext, n: PNode): PNode =
it.sons[0] = forceBool(c, semExprWithType(c, it.sons[0]))
when not newScopeForIf: openScope(c)
it.sons[1] = semExprBranch(c, it.sons[1])
typ = commonType(typ, it.sons[1].typ)
typ = commonType(typ, it.sons[1])
closeScope(c)
elif it.len == 1:
hasElse = true
it.sons[0] = semExprBranchScope(c, it.sons[0])
typ = commonType(typ, it.sons[0].typ)
typ = commonType(typ, it.sons[0])
else: illFormedAst(it)
if isEmptyType(typ) or typ.kind == tyNil or not hasElse:
if isEmptyType(typ) or typ.kind in {tyNil, tyExpr} or not hasElse:
for it in n: discardCheck(c, it.lastSon)
result.kind = nkIfStmt
# propagate any enforced VoidContext:
@@ -195,7 +180,8 @@ proc semIf(c: PContext, n: PNode): PNode =
else:
for it in n:
let j = it.len-1
it.sons[j] = fitNode(c, typ, it.sons[j], it.sons[j].info)
if not endsInNoReturn(it.sons[j]):
it.sons[j] = fitNode(c, typ, it.sons[j], it.sons[j].info)
result.kind = nkIfExpr
result.typ = typ

@@ -228,7 +214,7 @@ proc semCase(c: PContext, n: PNode): PNode =
semCaseBranch(c, n, x, i, covered)
var last = sonsLen(x)-1
x.sons[last] = semExprBranchScope(c, x.sons[last])
typ = commonType(typ, x.sons[last].typ)
typ = commonType(typ, x.sons[last])
of nkElifBranch:
chckCovered = false
checkSonsLen(x, 2)
@@ -236,13 +222,13 @@ proc semCase(c: PContext, n: PNode): PNode =
x.sons[0] = forceBool(c, semExprWithType(c, x.sons[0]))
when not newScopeForIf: openScope(c)
x.sons[1] = semExprBranch(c, x.sons[1])
typ = commonType(typ, x.sons[1].typ)
typ = commonType(typ, x.sons[1])
closeScope(c)
of nkElse:
chckCovered = false
checkSonsLen(x, 1)
x.sons[0] = semExprBranchScope(c, x.sons[0])
typ = commonType(typ, x.sons[0].typ)
typ = commonType(typ, x.sons[0])
hasElse = true
else:
illFormedAst(x)
@@ -252,7 +238,7 @@ proc semCase(c: PContext, n: PNode): PNode =
else:
localError(n.info, errNotAllCasesCovered)
closeScope(c)
if isEmptyType(typ) or typ.kind == tyNil or not hasElse:
if isEmptyType(typ) or typ.kind in {tyNil, tyExpr} or not hasElse:
for i in 1..n.len-1: discardCheck(c, n.sons[i].lastSon)
# propagate any enforced VoidContext:
if typ == enforceVoidContext:
@@ -261,7 +247,8 @@ proc semCase(c: PContext, n: PNode): PNode =
for i in 1..n.len-1:
var it = n.sons[i]
let j = it.len-1
it.sons[j] = fitNode(c, typ, it.sons[j], it.sons[j].info)
if not endsInNoReturn(it.sons[j]):
it.sons[j] = fitNode(c, typ, it.sons[j], it.sons[j].info)
result.typ = typ

proc semTry(c: PContext, n: PNode): PNode =
@@ -421,15 +408,6 @@ proc addToVarSection(c: PContext; result: var PNode; orig, identDefs: PNode) =
else:
result.add identDefs

proc addDefer(c: PContext; result: var PNode; s: PSym) =
let deferDestructorCall = createDestructorCall(c, s)
if deferDestructorCall != nil:
if result.kind != nkStmtList:
let oldResult = result
result = newNodeI(nkStmtList, result.info)
result.add oldResult
result.add deferDestructorCall

proc isDiscardUnderscore(v: PSym): bool =
if v.name.s == "_":
v.flags.incl(sfGenSym)
@@ -465,9 +443,10 @@ proc hasEmpty(typ: PType): bool =
result = result or hasEmpty(s)

proc makeDeref(n: PNode): PNode =
var t = skipTypes(n.typ, {tyGenericInst, tyAlias})
var t = n.typ
if t.kind in tyUserTypeClasses and t.isResolvedUserTypeClass:
t = t.lastSon
t = skipTypes(t, {tyGenericInst, tyAlias})
result = n
if t.kind == tyVar:
result = newNodeIT(nkHiddenDeref, n.info, t.sons[0])
@@ -553,6 +532,7 @@ proc semVarOrLet(c: PContext, n: PNode, symkind: TSymKind): PNode =
# this can only happen for errornous var statements:
if typ == nil: continue
typeAllowedCheck(a.info, typ, symkind)
liftTypeBoundOps(c, typ, a.info)
var tup = skipTypes(typ, {tyGenericInst, tyAlias})
if a.kind == nkVarTuple:
if tup.kind != tyTuple:
@@ -608,7 +588,6 @@ proc semVarOrLet(c: PContext, n: PNode, symkind: TSymKind): PNode =
if def.kind == nkPar: v.ast = def[j]
setVarType(v, tup.sons[j])
b.sons[j] = newSymNode(v)
if not newDestructors: addDefer(c, result, v)
checkNilable(v)
if sfCompileTime in v.flags: hasCompileTime = true
if hasCompileTime: vm.setupCompileTimeVar(c.module, c.cache, result)
@@ -696,7 +675,9 @@ proc semForVars(c: PContext, n: PNode): PNode =
if sfGenSym notin v.flags and not isDiscardUnderscore(v):
addForVarDecl(c, v)
inc(c.p.nestedLoopCounter)
openScope(c)
n.sons[length-1] = semStmt(c, n.sons[length-1])
closeScope(c)
dec(c.p.nestedLoopCounter)

proc implicitIterator(c: PContext, it: string, arg: PNode): PNode =
@@ -752,6 +733,16 @@ proc semRaise(c: PContext, n: PNode): PNode =
if typ.kind != tyRef or typ.lastSon.kind != tyObject:
localError(n.info, errExprCannotBeRaised)

# check if the given object inherits from Exception
var base = typ.lastSon
while true:
if base.sym.magic == mException:
break
if base.lastSon == nil:
localError(n.info, "raised object of type $1 does not inherit from Exception", [typ.sym.name.s])
return
base = base.lastSon

proc addGenericParamListToScope(c: PContext, n: PNode) =
if n.kind != nkGenericParams: illFormedAst(n)
for i in countup(0, sonsLen(n)-1):
@@ -774,24 +765,55 @@ proc typeSectionLeftSidePass(c: PContext, n: PNode) =
checkSonsLen(a, 3)
let name = a.sons[0]
var s: PSym
if name.kind == nkDotExpr:
s = qualifiedLookUp(c, name, {checkUndeclared, checkModule})
if s.kind != skType or
s.typ.skipTypes(abstractPtrs).kind != tyObject or
tfPartial notin s.typ.skipTypes(abstractPtrs).flags:
localError(name.info, "only .partial objects can be extended")
if name.kind == nkDotExpr and a[2].kind == nkObjectTy:
let pkgName = considerQuotedIdent(name[0])
let typName = considerQuotedIdent(name[1])
let pkg = c.graph.packageSyms.strTableGet(pkgName)
if pkg.isNil or pkg.kind != skPackage:
localError(name.info, "unknown package name: " & pkgName.s)
else:
let typsym = pkg.tab.strTableGet(typName)
if typsym.isNil:
s = semIdentDef(c, name[1], skType)
s.typ = newTypeS(tyObject, c)
s.typ.sym = s
s.flags.incl sfForward
pkg.tab.strTableAdd s
addInterfaceDecl(c, s)
elif typsym.kind == skType and sfForward in typsym.flags:
s = typsym
addInterfaceDecl(c, s)
else:
localError(name.info, typsym.name.s & " is not a type that can be forwarded")
s = typsym
else:
s = semIdentDef(c, name, skType)
s.typ = newTypeS(tyForward, c)
s.typ.sym = s # process pragmas:
if name.kind == nkPragmaExpr:
pragma(c, s, name.sons[1], typePragmas)
if sfForward in s.flags:
# check if the symbol already exists:
let pkg = c.module.owner
if not isTopLevel(c) or pkg.isNil:
localError(name.info, "only top level types in a package can be 'package'")
else:
let typsym = pkg.tab.strTableGet(s.name)
if typsym != nil:
if sfForward notin typsym.flags or sfNoForward notin typsym.flags:
typeCompleted(typsym)
typsym.info = s.info
else:
localError(name.info, "cannot complete type '" & s.name.s & "' twice; " &
"previous type completion was here: " & $typsym.info)
s = typsym
# add it here, so that recursive types are possible:
if sfGenSym notin s.flags: addInterfaceDecl(c, s)

a.sons[0] = newSymNode(s)

proc checkCovariantParamsUsages(genericType: PType) =
var body = genericType{-1}
var body = genericType[^1]

proc traverseSubTypes(t: PType): bool =
template error(msg) = localError(genericType.sym.info, msg)
@@ -826,7 +848,7 @@ proc checkCovariantParamsUsages(genericType: PType) =

of tyGenericInvocation:
let targetBody = t[0]
for i in 1 .. <t.len:
for i in 1 ..< t.len:
let param = t[i]
if param.kind == tyGenericParam:
if tfCovariant in param.flags:
@@ -972,8 +994,8 @@ proc checkForMetaFields(n: PNode) =
case t.kind
of tySequence, tySet, tyArray, tyOpenArray, tyVar, tyPtr, tyRef,
tyProc, tyGenericInvocation, tyGenericInst, tyAlias:
let start = ord(t.kind in {tyGenericInvocation, tyGenericInst})
for i in start .. <t.sons.len:
let start = int ord(t.kind in {tyGenericInvocation, tyGenericInst})
for i in start ..< t.sons.len:
checkMeta(t.sons[i])
else:
checkMeta(t)
@@ -1007,6 +1029,8 @@ proc typeSectionFinalPass(c: PContext, n: PNode) =
checkConstructedType(s.info, s.typ)
if s.typ.kind in {tyObject, tyTuple} and not s.typ.n.isNil:
checkForMetaFields(s.typ.n)
instAllTypeBoundOp(c, n.info)


proc semAllTypeSections(c: PContext; n: PNode): PNode =
proc gatherStmts(c: PContext; n: PNode; result: PNode) {.nimcall.} =
@@ -1061,9 +1085,11 @@ proc semTypeSection(c: PContext, n: PNode): PNode =
## to allow the type definitions in the section to reference each other
## without regard for the order of their definitions.
if sfNoForward notin c.module.flags or nfSem notin n.flags:
inc c.inTypeContext
typeSectionLeftSidePass(c, n)
typeSectionRightSidePass(c, n)
typeSectionFinalPass(c, n)
dec c.inTypeContext
result = n

proc semParamList(c: PContext, n, genericParams: PNode, s: PSym) =
@@ -1099,7 +1125,7 @@ proc addResultNode(c: PContext, n: PNode) =

proc copyExcept(n: PNode, i: int): PNode =
result = copyNode(n)
for j in 0.. <n.len:
for j in 0..<n.len:
if j != i: result.add(n.sons[j])

proc lookupMacro(c: PContext, n: PNode): PSym =
@@ -1113,7 +1139,7 @@ proc semProcAnnotation(c: PContext, prc: PNode;
validPragmas: TSpecialWords): PNode =
var n = prc.sons[pragmasPos]
if n == nil or n.kind == nkEmpty: return
for i in countup(0, <n.len):
for i in countup(0, n.len-1):
var it = n.sons[i]
var key = if it.kind == nkExprColonExpr: it.sons[0] else: it
let m = lookupMacro(c, key)
@@ -1264,7 +1290,7 @@ proc activate(c: PContext, n: PNode) =
of nkLambdaKinds:
discard semLambda(c, n, {})
of nkCallKinds:
for i in 1 .. <n.len: activate(c, n[i])
for i in 1 ..< n.len: activate(c, n[i])
else:
discard

@@ -1284,7 +1310,7 @@ proc semOverride(c: PContext, s: PSym, n: PNode) =
var obj = t.sons[1].sons[0]
while true:
incl(obj.flags, tfHasAsgn)
if obj.kind == tyGenericBody: obj = obj.lastSon
if obj.kind in {tyGenericBody, tyGenericInst}: obj = obj.lastSon
elif obj.kind == tyGenericInvocation: obj = obj.sons[0]
else: break
if obj.kind in {tyObject, tyDistinct}:
@@ -1294,13 +1320,9 @@ proc semOverride(c: PContext, s: PSym, n: PNode) =
localError(n.info, errGenerated,
"cannot bind another '" & s.name.s & "' to: " & typeToString(obj))
noError = true
if not noError:
if not noError and sfSystemModule notin s.owner.flags:
localError(n.info, errGenerated,
"signature for '" & s.name.s & "' must be proc[T: object](x: var T)")
else:
doDestructorStuff(c, s, n)
if not experimentalMode(c):
localError n.info, "use the {.experimental.} pragma to enable destructors"
incl(s.flags, sfUsed)
of "deepcopy", "=deepcopy":
if s.typ.len == 2 and
@@ -1350,8 +1372,9 @@ proc semOverride(c: PContext, s: PSym, n: PNode) =
localError(n.info, errGenerated,
"cannot bind another '" & s.name.s & "' to: " & typeToString(obj))
return
localError(n.info, errGenerated,
"signature for '" & s.name.s & "' must be proc[T: object](x: var T; y: T)")
if sfSystemModule notin s.owner.flags:
localError(n.info, errGenerated,
"signature for '" & s.name.s & "' must be proc[T: object](x: var T; y: T)")
else:
if sfOverriden in s.flags:
localError(n.info, errGenerated,
@@ -1526,8 +1549,11 @@ proc semProcAux(c: PContext, n: PNode, kind: TSymKind,
s.options = gOptions
if sfOverriden in s.flags or s.name.s[0] == '=': semOverride(c, s, n)
if s.name.s[0] in {'.', '('}:
if s.name.s in [".", ".()", ".=", "()"] and not experimentalMode(c):
if s.name.s in [".", ".()", ".="] and not experimentalMode(c) and not newDestructors:
message(n.info, warnDeprecated, "overloaded '.' and '()' operators are now .experimental; " & s.name.s)
elif s.name.s == "()" and not experimentalMode(c):
message(n.info, warnDeprecated, "overloaded '()' operators are now .experimental; " & s.name.s)

if n.sons[bodyPos].kind != nkEmpty:
# for DLL generation it is annoying to check for sfImportc!
if sfBorrow in s.flags:
@@ -1686,7 +1712,7 @@ proc evalInclude(c: PContext, n: PNode): PNode =
excl(c.includedFiles, f)

proc setLine(n: PNode, info: TLineInfo) =
for i in 0 .. <safeLen(n): setLine(n.sons[i], info)
for i in 0 ..< safeLen(n): setLine(n.sons[i], info)
n.info = info

proc semPragmaBlock(c: PContext, n: PNode): PNode =
@@ -1694,7 +1720,7 @@ proc semPragmaBlock(c: PContext, n: PNode): PNode =
pragma(c, nil, pragmaList, exprPragmas)
result = semExpr(c, n.sons[1])
n.sons[1] = result
for i in 0 .. <pragmaList.len:
for i in 0 ..< pragmaList.len:
case whichPragma(pragmaList.sons[i])
of wLine: setLine(result, pragmaList.sons[i].info)
of wLocks, wGcSafe:
@@ -1827,11 +1853,12 @@ proc semStmtList(c: PContext, n: PNode, flags: TExprFlags): PNode =
else:
n.typ = n.sons[i].typ
if not isEmptyType(n.typ): n.kind = nkStmtListExpr
case n.sons[i].kind
of LastBlockStmts:
if n.sons[i].kind in LastBlockStmts or
n.sons[i].kind in nkCallKinds and n.sons[i][0].kind == nkSym and sfNoReturn in n.sons[i][0].sym.flags:
for j in countup(i + 1, length - 1):
case n.sons[j].kind
of nkPragma, nkCommentStmt, nkNilLit, nkEmpty: discard
of nkPragma, nkCommentStmt, nkNilLit, nkEmpty, nkBlockExpr,
nkBlockStmt, nkState: discard
else: localError(n.sons[j].info, errStmtInvalidAfterReturn)
else: discard


@@ -75,7 +75,7 @@ proc symChoice(c: PContext, n: PNode, s: PSym, r: TSymChoiceRule): PNode =
a = nextOverloadIter(o, c, n)

proc semBindStmt(c: PContext, n: PNode, toBind: var IntSet): PNode =
for i in 0 .. < n.len:
for i in 0 ..< n.len:
var a = n.sons[i]
# If 'a' is an overloaded symbol, we used to use the first symbol
# as a 'witness' and use the fact that subsequent lookups will yield
@@ -95,7 +95,7 @@ proc semBindStmt(c: PContext, n: PNode, toBind: var IntSet): PNode =
result = newNodeI(nkEmpty, n.info)

proc semMixinStmt(c: PContext, n: PNode, toMixin: var IntSet): PNode =
for i in 0 .. < n.len:
for i in 0 ..< n.len:
toMixin.incl(considerQuotedIdent(n.sons[i]).id)
result = newNodeI(nkEmpty, n.info)

@@ -113,13 +113,9 @@ type
owner: PSym
cursorInBody: bool # only for nimsuggest
scopeN: int
bracketExpr: PNode

template withBracketExpr(ctx, x, body: untyped) =
let old = ctx.bracketExpr
ctx.bracketExpr = x
body
ctx.bracketExpr = old

proc getIdentNode(c: var TemplCtx, n: PNode): PNode =
case n.kind
@@ -163,7 +159,7 @@ proc onlyReplaceParams(c: var TemplCtx, n: PNode): PNode =
result = newSymNode(s, n.info)
styleCheckUse(n.info, s)
else:
for i in 0 .. <n.safeLen:
for i in 0 ..< n.safeLen:
result.sons[i] = onlyReplaceParams(c, n.sons[i])

proc newGenSym(kind: TSymKind, n: PNode, c: var TemplCtx): PSym =
@@ -301,21 +297,9 @@ proc semPattern(c: PContext, n: PNode): PNode

proc semTemplBodySons(c: var TemplCtx, n: PNode): PNode =
result = n
for i in 0.. < n.len:
for i in 0 ..< n.len:
result.sons[i] = semTemplBody(c, n.sons[i])

proc oprIsRoof(n: PNode): bool =
const roof = "^"
case n.kind
of nkIdent: result = n.ident.s == roof
of nkSym: result = n.sym.name.s == roof
of nkAccQuoted:
if n.len == 1:
result = oprIsRoof(n.sons[0])
of nkOpenSymChoice, nkClosedSymChoice:
result = oprIsRoof(n.sons[0])
else: discard

proc semTemplBody(c: var TemplCtx, n: PNode): PNode =
result = n
semIdeForTemplateOrGenericCheck(n, c.cursorInBody)
@@ -347,7 +331,7 @@ proc semTemplBody(c: var TemplCtx, n: PNode): PNode =
of nkMixinStmt:
if c.scopeN > 0: result = semTemplBodySons(c, n)
else: result = semMixinStmt(c.c, n, c.toMixin)
of nkEmpty, nkSym..nkNilLit:
of nkEmpty, nkSym..nkNilLit, nkComesFrom:
discard
of nkIfStmt:
for i in countup(0, sonsLen(n)-1):
@@ -382,8 +366,10 @@ proc semTemplBody(c: var TemplCtx, n: PNode): PNode =
n.sons[L-2] = semTemplBody(c, n.sons[L-2])
for i in countup(0, L - 3):
addLocalDecl(c, n.sons[i], skForVar)
openScope(c)
n.sons[L-1] = semTemplBody(c, n.sons[L-1])
closeScope(c)
closeScope(c)
of nkBlockStmt, nkBlockExpr, nkBlockType:
checkSonsLen(n, 2)
openScope(c)
@@ -506,8 +492,6 @@ proc semTemplBody(c: var TemplCtx, n: PNode): PNode =
result = semTemplBodySons(c, n)
of nkCallKinds-{nkPostfix}:
result = semTemplBodySons(c, n)
if c.bracketExpr != nil and n.len == 2 and oprIsRoof(n.sons[0]):
result.add c.bracketExpr
of nkDotExpr, nkAccQuoted:
# dotExpr is ambiguous: note that we explicitly allow 'x.TemplateParam',
# so we use the generic code for nkDotExpr too
@@ -544,7 +528,7 @@ proc semTemplBodyDirty(c: var TemplCtx, n: PNode): PNode =
result = semTemplBodyDirty(c, n.sons[0])
of nkBindStmt:
result = semBindStmt(c.c, n, c.toBind)
of nkEmpty, nkSym..nkNilLit:
of nkEmpty, nkSym..nkNilLit, nkComesFrom:
discard
else:
# dotExpr is ambiguous: note that we explicitly allow 'x.TemplateParam',

@@ -138,7 +138,7 @@ proc semAnyRef(c: PContext; n: PNode; kind: TTypeKind; prev: PType): PType =
if n.len < 1:
result = newConstraint(c, kind)
else:
let isCall = ord(n.kind in nkCallKinds+{nkBracketExpr})
let isCall = int ord(n.kind in nkCallKinds+{nkBracketExpr})
let n = if n[0].kind == nkBracket: n[0] else: n
checkMinSonsLen(n, 1)
var t = semTypeNode(c, n.lastSon, nil)
@@ -235,7 +235,10 @@ proc semRange(c: PContext, n: PNode, prev: PType): PType =
n.sons[1].floatVal < 0.0:
incl(result.flags, tfNeedsInit)
else:
localError(n.sons[0].info, errRangeExpected)
if n[1].kind == nkInfix and considerQuotedIdent(n[1][0]).s == "..<":
localError(n[0].info, "range types need to be constructed with '..', '..<' is not supported")
else:
localError(n.sons[0].info, errRangeExpected)
result = newOrPrevType(tyError, prev, c)
else:
localError(n.info, errXExpectsOneTypeParam, "range")
@@ -293,7 +296,9 @@ proc semArray(c: PContext, n: PNode, prev: PType): PType =
base = semTypeNode(c, n.sons[2], nil)
# ensure we only construct a tyArray when there was no error (bug #3048):
result = newOrPrevType(tyArray, prev, c)
addSonSkipIntLit(result, indx)
# bug #6682: Do not propagate initialization requirements etc for the
# index type:
rawAddSonNoPropagationOfTypeFlags(result, indx)
addSonSkipIntLit(result, base)
else:
localError(n.info, errArrayExpectsTwoTypeParams)
@@ -315,11 +320,8 @@ proc semTypeIdent(c: PContext, n: PNode): PSym =
if n.kind == nkSym:
result = getGenSym(c, n.sym)
else:
when defined(nimfix):
result = pickSym(c, n, skType)
if result.isNil:
result = qualifiedLookUp(c, n, {checkAmbiguity, checkUndeclared})
else:
result = pickSym(c, n, {skType, skGenericParam})
if result.isNil:
result = qualifiedLookUp(c, n, {checkAmbiguity, checkUndeclared})
if result != nil:
markUsed(n.info, result, c.graph.usageSym)
@@ -515,7 +517,7 @@ proc semCaseBranch(c: PContext, t, branch: PNode, branchIndex: int,
# first element is special and will overwrite: branch.sons[i]:
branch.sons[i] = semCaseBranchSetElem(c, t, r[0], covered)
# other elements have to be added to ``branch``
for j in 1 .. <r.len:
for j in 1 ..< r.len:
branch.add(semCaseBranchSetElem(c, t, r[j], covered))
# caution! last son of branch must be the actions to execute:
var L = branch.len
@@ -846,7 +848,7 @@ proc liftParamType(c: PContext, procKind: TSymKind, genericParams: PNode,
@[newTypeS(paramType.kind, c)])
result = addImplicitGeneric(typ)
else:
for i in 0 .. <paramType.len:
for i in 0 ..< paramType.len:
if paramType.sons[i] == paramType:
globalError(info, errIllegalRecursionInTypeX, typeToString(paramType))
var lifted = liftingWalk(paramType.sons[i])
@@ -897,7 +899,7 @@ proc liftParamType(c: PContext, procKind: TSymKind, genericParams: PNode,
result.shouldHaveMeta

of tyGenericInvocation:
for i in 1 .. <paramType.len:
for i in 1 ..< paramType.len:
let lifted = liftingWalk(paramType.sons[i])
if lifted != nil: paramType.sons[i] = lifted

@@ -1045,6 +1047,12 @@ proc semProcTypeNode(c: PContext, n, genericParams: PNode,
result.flags.incl tfIterator
# XXX Would be nice if we could get rid of this
result.sons[0] = r
let oldFlags = result.flags
propagateToOwner(result, r)
if oldFlags != result.flags:
# XXX This rather hacky way keeps 'tflatmap' compiling:
if tfHasMeta notin oldFlags:
result.flags.excl tfHasMeta
result.n.typ = r

if genericParams != nil and genericParams.len > 0:
@@ -1146,7 +1154,7 @@ proc semGeneric(c: PContext, n: PNode, s: PSym, prev: PType): PType =

var isConcrete = true

for i in 1 .. <m.call.len:
for i in 1 ..< m.call.len:
var typ = m.call[i].typ
if typ.kind == tyTypeDesc and typ.sons[0].kind == tyNone:
isConcrete = false
@@ -1167,7 +1175,10 @@ proc semGeneric(c: PContext, n: PNode, s: PSym, prev: PType): PType =

# special check for generic object with
# generic/partial specialized parent
let tx = result.skipTypes(abstractPtrs)
let tx = result.skipTypes(abstractPtrs, 50)
if tx.isNil:
localError(n.info, "invalid recursion in type '$1'" % typeToString(result[0]))
return errorType(c)
if tx != result and tx.kind == tyObject and tx.sons[0] != nil:
semObjectTypeForInheritedGenericInst(c, n, tx)

@@ -1215,8 +1226,6 @@ template modifierTypeKindOfNode(n: PNode): TTypeKind =

proc semTypeClass(c: PContext, n: PNode, prev: PType): PType =
# if n.sonsLen == 0: return newConstraint(c, tyTypeClass)
if nfBase2 in n.flags:
message(n.info, warnDeprecated, "use 'concept' instead; 'generic'")
let
pragmas = n[1]
inherited = n[2]
@@ -1295,8 +1304,7 @@ proc symFromExpectedTypeNode(c: PContext, n: PNode): PSym =

proc semTypeNode(c: PContext, n: PNode, prev: PType): PType =
result = nil
when defined(nimsuggest):
inc c.inTypeContext
inc c.inTypeContext

if gCmd == cmdIdeTools: suggestExpr(c, n)
case n.kind
@@ -1355,7 +1363,7 @@ proc semTypeNode(c: PContext, n: PNode, prev: PType): PType =
case n.len
of 3:
result = semTypeNode(c, n.sons[1], prev)
if result.skipTypes({tyGenericInst, tyAlias}).kind in NilableTypes+GenericTypes and
if result.skipTypes({tyGenericInst, tyAlias}).kind in NilableTypes+GenericTypes+{tyForward} and
n.sons[2].kind == nkNilLit:
result = freshType(result, prev)
result.flags.incl(tfNotNil)
@@ -1418,9 +1426,13 @@ proc semTypeNode(c: PContext, n: PNode, prev: PType): PType =
else: result = semGeneric(c, n, s, prev)
of nkDotExpr:
let typeExpr = semExpr(c, n)
if typeExpr.typ.kind == tyFromExpr:
return typeExpr.typ
if typeExpr.typ.kind != tyTypeDesc:
if typeExpr.typ.isNil:
localError(n.info, "object constructor needs an object type;" &
" for named arguments use '=' instead of ':'")
result = errorType(c)
elif typeExpr.typ.kind == tyFromExpr:
result = typeExpr.typ
elif typeExpr.typ.kind != tyTypeDesc:
localError(n.info, errTypeExpected)
result = errorType(c)
else:
@@ -1511,8 +1523,13 @@ proc semTypeNode(c: PContext, n: PNode, prev: PType): PType =
localError(n.info, errTypeExpected)
result = newOrPrevType(tyError, prev, c)
n.typ = result
when defined(nimsuggest):
dec c.inTypeContext
dec c.inTypeContext
if c.inTypeContext == 0: instAllTypeBoundOp(c, n.info)

when false:
proc semTypeNode(c: PContext, n: PNode, prev: PType): PType =
result = semTypeNodeInner(c, n, prev)
instAllTypeBoundOp(c, n.info)

proc setMagicType(m: PSym, kind: TTypeKind, size: int) =
# source : https://en.wikipedia.org/wiki/Data_structure_alignment#x86
@@ -1600,6 +1617,7 @@ proc processMagicType(c: PContext, m: PSym) =
rawAddSon(m.typ, newTypeS(tyNone, c))
of mPNimrodNode:
incl m.typ.flags, tfTriggersCompileTime
of mException: discard
else: localError(m.info, errTypeExpected)

proc semGenericConstraints(c: PContext, x: PType): PType =
@@ -1614,8 +1632,8 @@ proc semGenericParamList(c: PContext, n: PNode, father: PType = nil): PNode =
var a = n.sons[i]
if a.kind != nkIdentDefs: illFormedAst(n)
let L = a.len
var def = a{-1}
let constraint = a{-2}
var def = a[^1]
let constraint = a[^2]
var typ: PType

if constraint.kind != nkEmpty:

@@ -77,7 +77,7 @@ type
topLayer*: TIdTable
nextLayer*: ptr LayeredIdTable

TReplTypeVars* {.final.} = object
TReplTypeVars* = object
c*: PContext
typeMap*: ptr LayeredIdTable # map PType to PType
symMap*: TIdTable # map PSym to PSym
@@ -133,7 +133,7 @@ proc prepareNode(cl: var TReplTypeVars, n: PNode): PNode =
result.typ = t
if result.kind == nkSym: result.sym = replaceTypeVarsS(cl, n.sym)
let isCall = result.kind in nkCallKinds
for i in 0 .. <n.safeLen:
for i in 0 ..< n.safeLen:
# XXX HACK: ``f(a, b)``, avoid to instantiate `f`
if isCall and i == 0: result.add(n[i])
else: result.add(prepareNode(cl, n[i]))
@@ -151,7 +151,7 @@ proc hasGenericArguments*(n: PNode): bool =
(n.sym.kind == skType and
n.sym.typ.flags * {tfGenericTypeParam, tfImplicitTypeParam} != {})
else:
for i in 0.. <n.safeLen:
for i in 0..<n.safeLen:
if hasGenericArguments(n.sons[i]): return true
return false

@@ -166,13 +166,13 @@ proc reResolveCallsWithTypedescParams(cl: var TReplTypeVars, n: PNode): PNode =
# overload resolution is executed again (which may trigger generateInstance).
if n.kind in nkCallKinds and sfFromGeneric in n[0].sym.flags:
var needsFixing = false
for i in 1 .. <n.safeLen:
for i in 1 ..< n.safeLen:
if isTypeParam(n[i]): needsFixing = true
if needsFixing:
n.sons[0] = newSymNode(n.sons[0].sym.owner)
return cl.c.semOverloadedCall(cl.c, n, n, {skProc, skFunc}, {})

for i in 0 .. <n.safeLen:
for i in 0 ..< n.safeLen:
n.sons[i] = reResolveCallsWithTypedescParams(cl, n[i])

return n
@@ -261,6 +261,17 @@ proc instCopyType*(cl: var TReplTypeVars, t: PType): PType =
if not (t.kind in tyMetaTypes or
(t.kind == tyStatic and t.n == nil)):
result.flags.excl tfInstClearedFlags
when false:
if newDestructors:
result.assignment = nil
#result.destructor = nil
result.sink = nil

template typeBound(c, newty, oldty, field, info) =
let opr = newty.field
if opr != nil and sfFromGeneric notin opr.flags:
# '=' needs to be instantiated for generics when the type is constructed:
newty.field = c.instTypeBoundOp(c, opr, oldty, info, attachedAsgn, 1)

proc handleGenericInvocation(cl: var TReplTypeVars, t: PType): PType =
# tyGenericInvocation[A, tyGenericInvocation[A, B]]
@@ -357,17 +368,10 @@ proc handleGenericInvocation(cl: var TReplTypeVars, t: PType): PType =
assert newbody.kind in {tyRef, tyPtr}
assert newbody.lastSon.typeInst == nil
newbody.lastSon.typeInst = result
template typeBound(field) =
let opr = newbody.field
if opr != nil and sfFromGeneric notin opr.flags:
# '=' needs to be instantiated for generics when the type is constructed:
newbody.field = cl.c.instTypeBoundOp(cl.c, opr, result, cl.info,
attachedAsgn, 1)
# we need to produce the destructor first here because generated '='
# and '=sink' operators can rely on it:
if newDestructors: typeBound(destructor)
typeBound(assignment)
typeBound(sink)
if newDestructors:
cl.c.typesWithOps.add((newbody, result))
else:
typeBound(cl.c, newbody, result, assignment, cl.info)
let methods = skipTypes(bbody, abstractPtrs).methods
for col, meth in items(methods):
# we instantiate the known methods belonging to that type, this causes
@@ -381,11 +385,11 @@ proc eraseVoidParams*(t: PType) =
if t.sons[0] != nil and t.sons[0].kind == tyVoid:
t.sons[0] = nil

for i in 1 .. <t.sonsLen:
for i in 1 ..< t.sonsLen:
# don't touch any memory unless necessary
if t.sons[i].kind == tyVoid:
var pos = i
for j in i+1 .. <t.sonsLen:
for j in i+1 ..< t.sonsLen:
if t.sons[j].kind != tyVoid:
t.sons[pos] = t.sons[j]
t.n.sons[pos] = t.n.sons[j]
@@ -395,7 +399,7 @@ proc eraseVoidParams*(t: PType) =
return

proc skipIntLiteralParams*(t: PType) =
for i in 0 .. <t.sonsLen:
for i in 0 ..< t.sonsLen:
let p = t.sons[i]
if p == nil: continue
let skipped = p.skipIntLit
@@ -496,7 +500,7 @@ proc replaceTypeVarsTAux(cl: var TReplTypeVars, t: PType): PType =
bailout()
result = instCopyType(cl, t)
idTablePut(cl.localCache, t, result)
for i in 1 .. <result.sonsLen:
for i in 1 ..< result.sonsLen:
result.sons[i] = replaceTypeVarsT(cl, result.sons[i])
propagateToOwner(result, result.lastSon)

@@ -518,7 +522,8 @@ proc replaceTypeVarsTAux(cl: var TReplTypeVars, t: PType): PType =
if r2.kind in {tyPtr, tyRef}:
r = skipTypes(r2, {tyPtr, tyRef})
result.sons[i] = r
propagateToOwner(result, r)
if result.kind != tyArray or i != 0:
propagateToOwner(result, r)
# bug #4677: Do not instantiate effect lists
result.n = replaceTypeVarsN(cl, result.n, ord(result.kind==tyProc))
case result.kind
@@ -535,6 +540,17 @@ proc replaceTypeVarsTAux(cl: var TReplTypeVars, t: PType): PType =

else: discard

proc instAllTypeBoundOp*(c: PContext, info: TLineInfo) =
if not newDestructors: return
var i = 0
while i < c.typesWithOps.len:
let (newty, oldty) = c.typesWithOps[i]
typeBound(c, newty, oldty, destructor, info)
typeBound(c, newty, oldty, sink, info)
typeBound(c, newty, oldty, assignment, info)
inc i
setLen(c.typesWithOps, 0)

proc initTypeVars*(p: PContext, typeMap: ptr LayeredIdTable, info: TLineInfo;
owner: PSym): TReplTypeVars =
initIdTable(result.symMap)

@@ -87,6 +87,7 @@ type
CoProc
CoType
CoOwnerSig
CoIgnoreRange

proc hashType(c: var MD5Context, t: PType; flags: set[ConsiderFlag])

@@ -136,7 +137,7 @@ proc hashTree(c: var MD5Context, n: PNode) =
of nkStrLit..nkTripleStrLit:
c &= n.strVal
else:
for i in 0.. <n.len: hashTree(c, n.sons[i])
for i in 0..<n.len: hashTree(c, n.sons[i])

proc hashType(c: var MD5Context, t: PType; flags: set[ConsiderFlag]) =
if t == nil:
@@ -159,14 +160,15 @@ proc hashType(c: var MD5Context, t: PType; flags: set[ConsiderFlag]) =
return
else:
discard
c &= char(t.kind)
case t.kind
of tyBool, tyChar, tyInt..tyUInt64:
# no canonicalization for integral types, so that e.g. ``pid_t`` is
# produced instead of ``NI``:
c &= char(t.kind)
if t.sym != nil and {sfImportc, sfExportc} * t.sym.flags != {}:
c.hashSym(t.sym)
of tyObject, tyEnum:
c &= char(t.kind)
if t.typeInst != nil:
assert t.typeInst.kind == tyGenericInst
for i in countup(1, sonsLen(t.typeInst) - 2):
@@ -199,26 +201,35 @@ proc hashType(c: var MD5Context, t: PType; flags: set[ConsiderFlag]) =
if t.len > 0 and t.sons[0] != nil:
hashType c, t.sons[0], flags
of tyRef, tyPtr, tyGenericBody, tyVar:
c &= char(t.kind)
c.hashType t.lastSon, flags
if tfVarIsPtr in t.flags: c &= ".varisptr"
of tyFromExpr:
c &= char(t.kind)
c.hashTree(t.n)
of tyTuple:
c &= char(t.kind)
if t.n != nil and CoType notin flags:
assert(sonsLen(t.n) == sonsLen(t))
for i in countup(0, sonsLen(t.n) - 1):
assert(t.n.sons[i].kind == nkSym)
c &= t.n.sons[i].sym.name.s
c &= ':'
c.hashType(t.sons[i], flags)
c.hashType(t.sons[i], flags+{CoIgnoreRange})
c &= ','
else:
for i in countup(0, sonsLen(t) - 1): c.hashType t.sons[i], flags
of tyRange, tyStatic:
#if CoType notin flags:
for i in countup(0, sonsLen(t) - 1): c.hashType t.sons[i], flags+{CoIgnoreRange}
of tyRange:
if CoIgnoreRange notin flags:
c &= char(t.kind)
c.hashTree(t.n)
c.hashType(t.sons[0], flags)
of tyStatic:
c &= char(t.kind)
c.hashTree(t.n)
c.hashType(t.sons[0], flags)
of tyProc:
c &= char(t.kind)
c &= (if tfIterator in t.flags: "iterator " else: "proc ")
if CoProc in flags and t.n != nil:
let params = t.n
@@ -230,14 +241,18 @@ proc hashType(c: var MD5Context, t: PType; flags: set[ConsiderFlag]) =
c &= ','
c.hashType(t.sons[0], flags)
else:
for i in 0.. <t.len: c.hashType(t.sons[i], flags)
for i in 0..<t.len: c.hashType(t.sons[i], flags)
c &= char(t.callConv)
if CoType notin flags:
if tfNoSideEffect in t.flags: c &= ".noSideEffect"
if tfThread in t.flags: c &= ".thread"
if tfVarargs in t.flags: c &= ".varargs"
of tyArray:
c &= char(t.kind)
for i in 0..<t.len: c.hashType(t.sons[i], flags-{CoIgnoreRange})
else:
for i in 0.. <t.len: c.hashType(t.sons[i], flags)
c &= char(t.kind)
for i in 0..<t.len: c.hashType(t.sons[i], flags)
if tfNotNil in t.flags and CoType notin flags: c &= "not nil"

when defined(debugSigHashes):

@@ -55,6 +55,7 @@ type
# a distrinct type
typedescMatched*: bool
isNoCall*: bool # misused for generic type instantiations C[T]
mutabilityProblem*: uint8 # tyVar mismatch
inferredTypes: seq[PType] # inferred types during the current signature
# matching. they will be reset if the matching
# is not successful. may replace the bindings
@@ -66,7 +67,6 @@ type
# or when the explain pragma is used. may be
# triggered with an idetools command in the
# future.
mutabilityProblem*: uint8 # tyVar mismatch
inheritancePenalty: int # to prefer closest father object type

TTypeRelFlag* = enum
@@ -200,7 +200,7 @@ proc sumGeneric(t: PType): int =
inc result
of tyGenericInvocation, tyTuple, tyProc, tyAnd:
result += ord(t.kind in {tyGenericInvocation, tyAnd})
for i in 0 .. <t.len:
for i in 0 ..< t.len:
if t.sons[i] != nil:
result += t.sons[i].sumGeneric
break
@@ -220,11 +220,12 @@ proc sumGeneric(t: PType): int =
proc complexDisambiguation(a, b: PType): int =
# 'a' matches better if *every* argument matches better or equal than 'b'.
var winner = 0
for i in 1 .. <min(a.len, b.len):
for i in 1 ..< min(a.len, b.len):
let x = a.sons[i].sumGeneric
let y = b.sons[i].sumGeneric
#if ggDebug:
# echo "came her ", typeToString(a.sons[i]), " ", typeToString(b.sons[i])
#echo "came herA ", typeToString(a.sons[i]), " ", x
#echo "came herB ", typeToString(b.sons[i]), " ", y
if x != y:
if winner == 0:
if x > y: winner = 1
@@ -239,8 +240,8 @@ proc complexDisambiguation(a, b: PType): int =
result = winner
when false:
var x, y: int
for i in 1 .. <a.len: x += a.sons[i].sumGeneric
for i in 1 .. <b.len: y += b.sons[i].sumGeneric
for i in 1 ..< a.len: x += a.sons[i].sumGeneric
for i in 1 ..< b.len: y += b.sons[i].sumGeneric
result = x - y

proc writeMatches*(c: TCandidate) =
@@ -275,7 +276,7 @@ proc cmpCandidates*(a, b: TCandidate): int =
proc argTypeToString(arg: PNode; prefer: TPreferedDesc): string =
if arg.kind in nkSymChoices:
result = typeToString(arg[0].typ, prefer)
for i in 1 .. <arg.len:
for i in 1 ..< arg.len:
result.add(" | ")
result.add typeToString(arg[i].typ, prefer)
elif arg.typ == nil:
@@ -389,7 +390,16 @@ proc isConvertibleToRange(f, a: PType): bool =
# be less picky for tyRange, as that it is used for array indexing:
if f.kind in {tyInt..tyInt64, tyUInt..tyUInt64} and
a.kind in {tyInt..tyInt64, tyUInt..tyUInt64}:
result = true
case f.kind
of tyInt, tyInt64: result = true
of tyInt8: result = a.kind in {tyInt8, tyInt}
of tyInt16: result = a.kind in {tyInt8, tyInt16, tyInt}
of tyInt32: result = a.kind in {tyInt8, tyInt16, tyInt32, tyInt}
of tyUInt, tyUInt64: result = true
of tyUInt8: result = a.kind in {tyUInt8, tyUInt}
of tyUInt16: result = a.kind in {tyUInt8, tyUInt16, tyUInt}
of tyUInt32: result = a.kind in {tyUInt8, tyUInt16, tyUInt32, tyUInt}
else: result = false
elif f.kind in {tyFloat..tyFloat128} and
a.kind in {tyFloat..tyFloat128}:
result = true
@@ -580,7 +590,7 @@ proc procTypeRel(c: var TCandidate, f, a: PType): TTypeRelation =

# Note: We have to do unification for the parameters before the
# return type!
for i in 1 .. <f.sonsLen:
for i in 1 ..< f.sonsLen:
checkParam(f.sons[i], a.sons[i])

if f.sons[0] != nil:
@@ -658,7 +668,7 @@ proc matchUserTypeClass*(m: var TCandidate; ff, a: PType): PType =
var typeParams: seq[(PSym, PType)]

if ff.kind == tyUserTypeClassInst:
for i in 1 .. <(ff.len - 1):
for i in 1 ..< (ff.len - 1):
var
typeParamName = ff.base.sons[i-1].sym.name
typ = ff.sons[i]
@@ -1047,9 +1057,10 @@ proc typeRelImpl(c: var TCandidate, f, aOrig: PType,
else: isNone

of tyUserTypeClass, tyUserTypeClassInst:
if c.c.matchedConcept != nil:
if c.c.matchedConcept != nil and c.c.matchedConcept.depth <= 4:
# consider this: 'var g: Node' *within* a concept where 'Node'
# is a concept too (tgraph)
inc c.c.matchedConcept.depth
let x = typeRel(c, a, f, flags + {trDontBind})
if x >= isGeneric:
return isGeneric
@@ -1374,7 +1385,7 @@ proc typeRelImpl(c: var TCandidate, f, aOrig: PType,
# XXX: This is very hacky. It should be moved back into liftTypeParam
if x.kind in {tyGenericInst, tyArray} and
c.calleeSym != nil and
c.calleeSym.kind in {skProc, skFunc}:
c.calleeSym.kind in {skProc, skFunc} and c.call != nil:
let inst = prepareMetatypeForSigmatch(c.c, c.bindings, c.call.info, f)
return typeRel(c, inst, a)

@@ -1451,8 +1462,13 @@ proc typeRelImpl(c: var TCandidate, f, aOrig: PType,
of tyOr:
considerPreviousT:
result = isNone
let oldInheritancePenalty = c.inheritancePenalty
var maxInheritance = 0
for branch in f.sons:
c.inheritancePenalty = 0
let x = typeRel(c, branch, aOrig)
maxInheritance = max(maxInheritance, c.inheritancePenalty)

# 'or' implies maximum matching result:
if x > result: result = x
if result >= isSubtype:
@@ -1460,6 +1476,7 @@ proc typeRelImpl(c: var TCandidate, f, aOrig: PType,
bindingRet result
else:
result = isNone
c.inheritancePenalty = oldInheritancePenalty + maxInheritance

of tyNot:
considerPreviousT:
@@ -1550,11 +1567,19 @@ proc typeRelImpl(c: var TCandidate, f, aOrig: PType,
result = isNone
else:
if f.sonsLen > 0 and f.sons[0].kind != tyNone:
let oldInheritancePenalty = c.inheritancePenalty
result = typeRel(c, f.lastSon, a, flags + {trDontBind})
if doBind and result notin {isNone, isGeneric}:
let concrete = concreteType(c, a)
if concrete == nil: return isNone
put(c, f, concrete)
# bug #6526
if result in {isEqual, isSubtype}:
# 'T: Class' is a *better* match than just 'T'
# but 'T: Subclass' is even better:
c.inheritancePenalty = oldInheritancePenalty - c.inheritancePenalty -
100 * ord(result == isEqual)
result = isGeneric
else:
result = isGeneric

@@ -1588,7 +1613,7 @@ proc typeRelImpl(c: var TCandidate, f, aOrig: PType,
if not exprStructuralEquivalent(f.n, aOrig.n):
result = isNone
if result != isNone: put(c, f, aOrig)
elif aOrig.n != nil:
elif aOrig.n != nil and aOrig.n.typ != nil:
result = typeRel(c, f.lastSon, aOrig.n.typ)
if result != isNone:
var boundType = newTypeWithSons(c.c, tyStatic, @[aOrig.n.typ])
@@ -2218,7 +2243,7 @@ proc matchesAux(c: PContext, n, nOrig: PNode,
proc semFinishOperands*(c: PContext, n: PNode) =
# this needs to be called to ensure that after overloading resolution every
# argument has been sem'checked:
for i in 1 .. <n.len:
for i in 1 ..< n.len:
n.sons[i] = prepareOperand(c, n.sons[i])

proc partialMatch*(c: PContext, n, nOrig: PNode, m: var TCandidate) =
@@ -2288,7 +2313,8 @@ proc instTypeBoundOp*(c: PContext; dc: PSym; t: PType; info: TLineInfo;
localError(info, errGenerated, "cannot instantiate '" & dc.name.s & "'")
else:
result = c.semGenerateInstance(c, dc, m.bindings, info)
assert sfFromGeneric in result.flags
if op == attachedDeepCopy:
assert sfFromGeneric in result.flags

include suggest


@@ -21,7 +21,7 @@
import
intsets, strutils, options, ast, astalgo, trees, treetab, msgs, os,
idents, renderer, types, passes, semfold, magicsys, cgmeth, rodread,
lambdalifting, sempass2, lowerings, lookups, destroyer
lambdalifting, sempass2, lowerings, lookups, destroyer, liftlocals

type
PTransNode* = distinct PNode
@@ -231,7 +231,7 @@ proc freshLabels(c: PTransf, n: PNode; symMap: var TIdTable) =
let x = PSym(idTableGet(symMap, n.sym))
if x != nil: n.sym = x
else:
for i in 0 .. <safeLen(n): freshLabels(c, n.sons[i], symMap)
for i in 0 ..< safeLen(n): freshLabels(c, n.sons[i], symMap)

proc transformBlock(c: PTransf, n: PNode): PTransNode =
var labl: PSym
@@ -275,7 +275,7 @@ proc transformWhile(c: PTransf; n: PNode): PTransNode =
var body = newTransNode(n)
for i in 0..n.len-2:
body[i] = transform(c, n.sons[i])
body[<n.len] = transformLoopBody(c, n.sons[<n.len])
body[n.len-1] = transformLoopBody(c, n.sons[n.len-1])
result[1] = body
discard c.breakSyms.pop

@@ -365,16 +365,22 @@ proc transformAddrDeref(c: PTransf, n: PNode, a, b: TNodeKind): PTransNode =
# addr ( nkConv ( deref ( x ) ) ) --> nkConv(x)
n.sons[0].sons[0] = m.sons[0]
result = PTransNode(n.sons[0])
if n.typ.skipTypes(abstractVar).kind != tyOpenArray:
PNode(result).typ = n.typ
of nkHiddenStdConv, nkHiddenSubConv, nkConv:
var m = n.sons[0].sons[1]
if m.kind == a or m.kind == b:
# addr ( nkConv ( deref ( x ) ) ) --> nkConv(x)
n.sons[0].sons[1] = m.sons[0]
result = PTransNode(n.sons[0])
if n.typ.skipTypes(abstractVar).kind != tyOpenArray:
PNode(result).typ = n.typ
else:
if n.sons[0].kind == a or n.sons[0].kind == b:
# addr ( deref ( x )) --> x
result = PTransNode(n.sons[0].sons[0])
if n.typ.skipTypes(abstractVar).kind != tyOpenArray:
PNode(result).typ = n.typ

proc generateThunk(prc: PNode, dest: PType): PNode =
## Converts 'prc' into '(thunk, nil)' so that it's compatible with
@@ -510,7 +516,7 @@ proc findWrongOwners(c: PTransf, n: PNode) =
internalError(x.info, "bah " & x.sym.name.s & " " &
x.sym.owner.name.s & " " & getCurrOwner(c).name.s)
else:
for i in 0 .. <safeLen(n): findWrongOwners(c, n.sons[i])
for i in 0 ..< safeLen(n): findWrongOwners(c, n.sons[i])

proc transformFor(c: PTransf, n: PNode): PTransNode =
# generate access statements for the parameters (unless they are constant)
@@ -640,7 +646,7 @@ proc transformArrayAccess(c: PTransf, n: PNode): PTransNode =
result = n.PTransNode
else:
result = newTransNode(n)
for i in 0 .. < n.len:
for i in 0 ..< n.len:
result[i] = transform(c, skipConv(n.sons[i]))

proc getMergeOp(n: PNode): PSym =
@@ -687,7 +693,7 @@ proc transformCall(c: PTransf, n: PNode): PTransNode =
inc(j)
add(result, a.PTransNode)
if len(result) == 2: result = result[1]
elif magic in {mNBindSym, mTypeOf}:
elif magic in {mNBindSym, mTypeOf, mRunnableExamples}:
# for bindSym(myconst) we MUST NOT perform constant folding:
result = n.PTransNode
elif magic == mProcCall:
@@ -744,7 +750,7 @@ proc dontInlineConstant(orig, cnst: PNode): bool {.inline.} =

proc commonOptimizations*(c: PSym, n: PNode): PNode =
result = n
for i in 0 .. < n.safeLen:
for i in 0 ..< n.safeLen:
result.sons[i] = commonOptimizations(c, n.sons[i])
var op = getMergeOp(n)
if (op != nil) and (op.magic != mNone) and (sonsLen(n) >= 3):
@@ -785,7 +791,7 @@ proc transform(c: PTransf, n: PNode): PTransNode =
case n.kind
of nkSym:
result = transformSym(c, n)
of nkEmpty..pred(nkSym), succ(nkSym)..nkNilLit:
of nkEmpty..pred(nkSym), succ(nkSym)..nkNilLit, nkComesFrom:
# nothing to be done for leaves:
result = PTransNode(n)
of nkBracketExpr: result = transformArrayAccess(c, n)
@@ -908,7 +914,7 @@ proc processTransf(c: PTransf, n: PNode, owner: PSym): PNode =
# Note: For interactive mode we cannot call 'passes.skipCodegen' and skip
# this step! We have to rely that the semantic pass transforms too errornous
# nodes into an empty node.
if c.fromCache or nfTransf in n.flags: return n
if c.rd != nil or nfTransf in n.flags: return n
pushTransCon(c, newTransCon(owner))
result = PNode(transform(c, n))
popTransCon(c)
@@ -972,6 +978,7 @@ proc transformBody*(module: PSym, n: PNode, prc: PSym): PNode =
liftDefer(c, result)
#result = liftLambdas(prc, result)
when useEffectSystem: trackProc(prc, result)
result = liftLocalsIfRequested(prc, result)
if c.needsDestroyPass and newDestructors:
result = injectDestructorCalls(prc, result)
incl(result.flags, nfTransf)

@@ -98,7 +98,7 @@ proc isDeepConstExpr*(n: PNode): bool =
of nkExprEqExpr, nkExprColonExpr, nkHiddenStdConv, nkHiddenSubConv:
result = isDeepConstExpr(n.sons[1])
of nkCurly, nkBracket, nkPar, nkObjConstr, nkClosure, nkRange:
for i in ord(n.kind == nkObjConstr) .. <n.len:
for i in ord(n.kind == nkObjConstr) ..< n.len:
if not isDeepConstExpr(n.sons[i]): return false
if n.typ.isNil: result = true
else:

@@ -14,7 +14,8 @@ import

type
TPreferedDesc* = enum
preferName, preferDesc, preferExported, preferModuleInfo, preferGenericArg
preferName, preferDesc, preferExported, preferModuleInfo, preferGenericArg,
preferTypeName

proc typeToString*(typ: PType; prefer: TPreferedDesc = preferName): string
template `$`*(typ: PType): string = typeToString(typ)
@@ -394,7 +395,7 @@ const
"and", "or", "not", "any", "static", "TypeFromExpr", "FieldAccessor",
"void"]

const preferToResolveSymbols = {preferName, preferModuleInfo, preferGenericArg}
const preferToResolveSymbols = {preferName, preferTypeName, preferModuleInfo, preferGenericArg}

template bindConcreteTypeToUserTypeClass*(tc, concrete: PType) =
tc.sons.safeAdd concrete
@@ -420,7 +421,7 @@ proc typeToString(typ: PType, prefer: TPreferedDesc = preferName): string =
sfAnon notin t.sym.flags:
if t.kind == tyInt and isIntLit(t):
result = t.sym.name.s & " literal(" & $t.n.intVal & ")"
elif prefer == preferName or t.sym.owner.isNil:
elif prefer in {preferName, preferTypeName} or t.sym.owner.isNil:
result = t.sym.name.s
if t.kind == tyGenericParam and t.sons != nil and t.sonsLen > 0:
result.add ": "
@@ -518,7 +519,7 @@ proc typeToString(typ: PType, prefer: TPreferedDesc = preferName): string =
result = "openarray[" & typeToString(t.sons[0]) & ']'
of tyDistinct:
result = "distinct " & typeToString(t.sons[0],
if prefer == preferModuleInfo: preferModuleInfo else: preferName)
if prefer == preferModuleInfo: preferModuleInfo else: preferTypeName)
of tyTuple:
# we iterate over t.sons here, because t.n may be nil
if t.n != nil:
@@ -532,7 +533,8 @@ proc typeToString(typ: PType, prefer: TPreferedDesc = preferName): string =
elif sonsLen(t) == 0:
result = "tuple[]"
else:
result = "("
if prefer == preferTypeName: result = "("
else: result = "tuple of ("
for i in countup(0, sonsLen(t) - 1):
add(result, typeToString(t.sons[i]))
if i < sonsLen(t) - 1: add(result, ", ")
@@ -742,7 +744,7 @@ proc equalParam(a, b: PSym): TParamsEquality =
proc sameConstraints(a, b: PNode): bool =
if isNil(a) and isNil(b): return true
internalAssert a.len == b.len
for i in 1 .. <a.len:
for i in 1 ..< a.len:
if not exprStructuralEquivalent(a[i].sym.constraint,
b[i].sym.constraint):
return false
@@ -970,7 +972,12 @@ proc sameTypeAux(x, y: PType, c: var TSameTypeClosure): bool =
tyArray, tyProc, tyVarargs, tyOrdinal, tyTypeClasses, tyOpt:
cycleCheck()
if a.kind == tyUserTypeClass and a.n != nil: return a.n == b.n
result = sameChildrenAux(a, b, c) and sameFlags(a, b)
result = sameChildrenAux(a, b, c)
if result:
if IgnoreTupleFields in c.flags:
result = a.flags * {tfVarIsPtr} == b.flags * {tfVarIsPtr}
else:
result = sameFlags(a, b)
if result and ExactGcSafety in c.flags:
result = a.flags * {tfThread} == b.flags * {tfThread}
if result and a.kind == tyProc:
@@ -990,6 +997,7 @@ proc sameTypeAux(x, y: PType, c: var TSameTypeClosure): bool =
proc sameBackendType*(x, y: PType): bool =
var c = initSameTypeClosure()
c.flags.incl IgnoreTupleFields
c.cmp = dcEqIgnoreDistinct
result = sameTypeAux(x, y, c)

proc compareTypes*(x, y: PType,
@@ -1509,7 +1517,7 @@ proc isCompileTimeOnly*(t: PType): bool {.inline.} =
proc containsCompileTimeOnly*(t: PType): bool =
if isCompileTimeOnly(t): return true
if t.sons != nil:
for i in 0 .. <t.sonsLen:
for i in 0 ..< t.sonsLen:
if t.sons[i] != nil and isCompileTimeOnly(t.sons[i]):
return true
return false

@@ -20,7 +20,7 @@ proc renderPlainSymbolName*(n: PNode): string =
result = ""
case n.kind
of nkPostfix, nkAccQuoted:
result = renderPlainSymbolName(n[<n.len])
result = renderPlainSymbolName(n[n.len-1])
of nkIdent:
result = n.ident.s
of nkSym:
@@ -58,8 +58,8 @@ proc renderType(n: PNode): string =
assert params.kind == nkFormalParams
assert len(params) > 0
result = "proc("
for i in 1 .. <len(params): result.add(renderType(params[i]) & ',')
result[<len(result)] = ')'
for i in 1 ..< len(params): result.add(renderType(params[i]) & ',')
result[len(result)-1] = ')'
else:
result = "proc"
of nkIdentDefs:
@@ -67,18 +67,18 @@ proc renderType(n: PNode): string =
let typePos = len(n) - 2
let typeStr = renderType(n[typePos])
result = typeStr
for i in 1 .. <typePos:
for i in 1 ..< typePos:
assert n[i].kind == nkIdent
result.add(',' & typeStr)
of nkTupleTy:
result = "tuple["
for i in 0 .. <len(n): result.add(renderType(n[i]) & ',')
result[<len(result)] = ']'
for i in 0 ..< len(n): result.add(renderType(n[i]) & ',')
result[len(result)-1] = ']'
of nkBracketExpr:
assert len(n) >= 2
result = renderType(n[0]) & '['
for i in 1 .. <len(n): result.add(renderType(n[i]) & ',')
result[<len(result)] = ']'
for i in 1 ..< len(n): result.add(renderType(n[i]) & ',')
result[len(result)-1] = ']'
else: result = ""
assert(not result.isNil)

@@ -91,7 +91,7 @@ proc renderParamTypes(found: var seq[string], n: PNode) =
## generator does include the information.
case n.kind
of nkFormalParams:
for i in 1 .. <len(n): renderParamTypes(found, n[i])
for i in 1 ..< len(n): renderParamTypes(found, n[i])
of nkIdentDefs:
# These are parameter names + type + default value node.
let typePos = len(n) - 2
@@ -102,7 +102,7 @@ proc renderParamTypes(found: var seq[string], n: PNode) =
let typ = n[typePos+1].typ
if not typ.isNil: typeStr = typeToString(typ, preferExported)
if typeStr.len < 1: return
for i in 0 .. <typePos:
for i in 0 ..< typePos:
found.add(typeStr)
else:
internalError(n.info, "renderParamTypes(found,n) with " & $n.kind)

@@ -322,7 +322,7 @@ proc opConv*(dest: var TFullReg, src: TFullReg, desttyp, srctyp: PType): bool =
if x <% n.len and (let f = n.sons[x].sym; f.position == x):
dest.node.strVal = if f.ast.isNil: f.name.s else: f.ast.strVal
else:
for i in 0.. <n.len:
for i in 0..<n.len:
if n.sons[i].kind != nkSym: internalError("opConv for enum")
let f = n.sons[i].sym
if f.position == x:
@@ -431,7 +431,7 @@ proc setLenSeq(c: PCtx; node: PNode; newLen: int; info: TLineInfo) =
setLen(node.sons, newLen)
if oldLen < newLen:
# TODO: This is still not correct for tyPtr, tyRef default value
for i in oldLen .. <newLen:
for i in oldLen ..< newLen:
node.sons[i] = newNodeI(typeKind, info)

proc rawExecute(c: PCtx, start: int, tos: PStackFrame): TFullReg =
@@ -1078,7 +1078,7 @@ proc rawExecute(c: PCtx, start: int, tos: PStackFrame): TFullReg =
regs[ra].node = newNodeI(nkBracket, c.debug[pc])
regs[ra].node.typ = typ
newSeq(regs[ra].node.sons, count)
for i in 0 .. <count:
for i in 0 ..< count:
regs[ra].node.sons[i] = getNullValue(typ.sons[0], c.debug[pc])
of opcNewStr:
decodeB(rkNode)
@@ -1213,7 +1213,7 @@ proc rawExecute(c: PCtx, start: int, tos: PStackFrame): TFullReg =
var u = regs[rb].node
if u.kind notin {nkEmpty..nkNilLit}:
# XXX can be optimized:
for i in 0.. <x.len: u.add(x.sons[i])
for i in 0..<x.len: u.add(x.sons[i])
else:
stackTrace(c, tos, pc, errGenerated, "cannot add to node kind: " & $u.kind)
regs[ra].node = u
@@ -1555,7 +1555,7 @@ proc execProc*(c: PCtx; sym: PSym; args: openArray[PNode]): PNode =
if not isEmptyType(sym.typ.sons[0]) or sym.kind == skMacro:
putIntoReg(tos.slots[0], getNullValue(sym.typ.sons[0], sym.info))
# XXX We could perform some type checking here.
for i in 1.. <sym.typ.len:
for i in 1..<sym.typ.len:
putIntoReg(tos.slots[i], args[i-1])

result = rawExecute(c, start, tos).regToNode
@@ -1637,7 +1637,7 @@ proc evalConstExprAux(module: PSym; cache: IdentCache; prc: PSym, n: PNode,
when debugEchoCode: c.echoCode start
var tos = PStackFrame(prc: prc, comesFrom: 0, next: nil)
newSeq(tos.slots, c.prc.maxSlots)
#for i in 0 .. <c.prc.maxSlots: tos.slots[i] = newNode(nkEmpty)
#for i in 0 ..< c.prc.maxSlots: tos.slots[i] = newNode(nkEmpty)
result = rawExecute(c, start, tos).regToNode
if result.info.line < 0: result.info = n.info

@@ -1670,7 +1670,7 @@ proc setupMacroParam(x: PNode, typ: PType): TFullReg =

iterator genericParamsInMacroCall*(macroSym: PSym, call: PNode): (PSym, PNode) =
let gp = macroSym.ast[genericParamsPos]
for i in 0 .. <gp.len:
for i in 0 ..< gp.len:
let genericParam = gp[i].sym
let posInCall = macroSym.typ.len + i
yield (genericParam, call[posInCall])
@@ -1688,8 +1688,7 @@ proc evalMacroCall*(module: PSym; cache: IdentCache, n, nOrig: PNode,
# arity here too:
if sym.typ.len > n.safeLen and sym.typ.len > 1:
globalError(n.info, "in call '$#' got $#, but expected $# argument(s)" % [
n.renderTree,
$ <n.safeLen, $ <sym.typ.len])
n.renderTree, $(n.safeLen-1), $(sym.typ.len-1)])

setupGlobalCtx(module, cache)
var c = globalCtx
@@ -1713,11 +1712,11 @@ proc evalMacroCall*(module: PSym; cache: IdentCache, n, nOrig: PNode,
tos.slots[0].node = newNodeI(nkEmpty, n.info)

# setup parameters:
|
||||
for i in 1.. <sym.typ.len:
|
||||
for i in 1..<sym.typ.len:
|
||||
tos.slots[i] = setupMacroParam(n.sons[i], sym.typ.sons[i])
|
||||
|
||||
let gp = sym.ast[genericParamsPos]
|
||||
for i in 0 .. <gp.len:
|
||||
for i in 0 ..< gp.len:
|
||||
if sfImmediate notin sym.flags:
|
||||
let idx = sym.typ.len + i
|
||||
if idx < n.len:
|
||||
@@ -1732,7 +1731,7 @@ proc evalMacroCall*(module: PSym; cache: IdentCache, n, nOrig: PNode,
|
||||
c.callsite = nil
|
||||
globalError(n.info, "static[T] or typedesc nor supported for .immediate macros")
|
||||
# temporary storage:
|
||||
#for i in L .. <maxSlots: tos.slots[i] = newNode(nkEmpty)
|
||||
#for i in L ..< maxSlots: tos.slots[i] = newNode(nkEmpty)
|
||||
result = rawExecute(c, start, tos).regToNode
|
||||
if result.info.line < 0: result.info = n.info
|
||||
if cyclicTree(result): globalError(n.info, errCyclicTree)
|
||||
|
||||
@@ -41,7 +41,7 @@ proc mapTypeToBracketX(name: string; m: TMagic; t: PType; info: TLineInfo;
|
||||
inst=false): PNode =
|
||||
result = newNodeIT(nkBracketExpr, if t.n.isNil: info else: t.n.info, t)
|
||||
result.add atomicTypeX(name, m, t, info)
|
||||
for i in 0 .. < t.len:
|
||||
for i in 0 ..< t.len:
|
||||
if t.sons[i] == nil:
|
||||
let void = atomicTypeX("void", mVoid, t, info)
|
||||
void.typ = newType(tyVoid, t.owner)
|
||||
@@ -84,10 +84,10 @@ proc mapTypeToAstX(t: PType; info: TLineInfo;
|
||||
|
||||
if inst:
|
||||
if t.sym != nil: # if this node has a symbol
|
||||
if allowRecursion: # getTypeImpl behavior: turn off recursion
|
||||
allowRecursion = false
|
||||
else: # getTypeInst behavior: return symbol
|
||||
if not allowRecursion: # getTypeInst behavior: return symbol
|
||||
return atomicType(t.sym)
|
||||
#else: # getTypeImpl behavior: turn off recursion
|
||||
# allowRecursion = false
|
||||
|
||||
case t.kind
|
||||
of tyNone: result = atomicType("none", mNone)
|
||||
@@ -119,24 +119,27 @@ proc mapTypeToAstX(t: PType; info: TLineInfo;
|
||||
result = atomicType("typeDesc", mTypeDesc)
|
||||
of tyGenericInvocation:
|
||||
result = newNodeIT(nkBracketExpr, if t.n.isNil: info else: t.n.info, t)
|
||||
for i in 0 .. < t.len:
|
||||
for i in 0 ..< t.len:
|
||||
result.add mapTypeToAst(t.sons[i], info)
|
||||
of tyGenericInst, tyAlias:
|
||||
of tyGenericInst:
|
||||
if inst:
|
||||
if allowRecursion:
|
||||
result = mapTypeToAstR(t.lastSon, info)
|
||||
else:
|
||||
result = newNodeX(nkBracketExpr)
|
||||
result.add mapTypeToAst(t.lastSon, info)
|
||||
for i in 1 .. < t.len-1:
|
||||
#result.add mapTypeToAst(t.lastSon, info)
|
||||
result.add mapTypeToAst(t[0], info)
|
||||
for i in 1 ..< t.len-1:
|
||||
result.add mapTypeToAst(t.sons[i], info)
|
||||
else:
|
||||
result = mapTypeToAstX(t.lastSon, info, inst, allowRecursion)
|
||||
of tyGenericBody:
|
||||
if inst:
|
||||
result = mapTypeToAstX(t.lastSon, info, inst, true)
|
||||
result = mapTypeToAstR(t.lastSon, info)
|
||||
else:
|
||||
result = mapTypeToAst(t.lastSon, info)
|
||||
of tyAlias:
|
||||
result = mapTypeToAstX(t.lastSon, info, inst, allowRecursion)
|
||||
of tyOrdinal:
|
||||
result = mapTypeToAst(t.lastSon, info)
|
||||
of tyDistinct:
|
||||
|
||||
@@ -30,7 +30,7 @@
|
||||
import
|
||||
strutils, ast, astalgo, types, msgs, renderer, vmdef,
|
||||
trees, intsets, rodread, magicsys, options, lowerings
|
||||
|
||||
import platform
|
||||
from os import splitFile
|
||||
|
||||
when hasFFI:
|
||||
@@ -401,7 +401,7 @@ proc sameConstant*(a, b: PNode): bool =
|
||||
|
||||
proc genLiteral(c: PCtx; n: PNode): int =
|
||||
# types do not matter here:
|
||||
for i in 0 .. <c.constants.len:
|
||||
for i in 0 ..< c.constants.len:
|
||||
if sameConstant(c.constants[i], n): return i
|
||||
result = rawGenLiteral(c, n)
|
||||
|
||||
@@ -430,7 +430,7 @@ proc genCase(c: PCtx; n: PNode; dest: var TDest) =
|
||||
c.gen(n.sons[0], tmp)
|
||||
# branch tmp, codeIdx
|
||||
# fjmp elseLabel
|
||||
for i in 1 .. <n.len:
|
||||
for i in 1 ..< n.len:
|
||||
let it = n.sons[i]
|
||||
if it.len == 1:
|
||||
# else stmt:
|
||||
@@ -460,7 +460,7 @@ proc genTry(c: PCtx; n: PNode; dest: var TDest) =
|
||||
c.gen(n.sons[0], dest)
|
||||
c.clearDest(n, dest)
|
||||
c.patch(elsePos)
|
||||
for i in 1 .. <n.len:
|
||||
for i in 1 ..< n.len:
|
||||
let it = n.sons[i]
|
||||
if it.kind != nkFinally:
|
||||
var blen = len(it)
|
||||
@@ -518,7 +518,7 @@ proc genCall(c: PCtx; n: PNode; dest: var TDest) =
|
||||
let x = c.getTempRange(n.len, slotTempUnknown)
|
||||
# varargs need 'opcSetType' for the FFI support:
|
||||
let fntyp = skipTypes(n.sons[0].typ, abstractInst)
|
||||
for i in 0.. <n.len:
|
||||
for i in 0..<n.len:
|
||||
#if i > 0 and i < sonsLen(fntyp):
|
||||
# let paramType = fntyp.n.sons[i]
|
||||
# if paramType.typ.isCompileTimeOnly: continue
|
||||
@@ -761,6 +761,49 @@ proc genCard(c: PCtx; n: PNode; dest: var TDest) =
|
||||
c.gABC(n, opcCard, dest, tmp)
|
||||
c.freeTemp(tmp)
|
||||
|
||||
proc genIntCast(c: PCtx; n: PNode; dest: var TDest) =
|
||||
const allowedIntegers = {tyInt..tyInt64, tyUInt..tyUInt64, tyChar}
|
||||
var signedIntegers = {tyInt8..tyInt32}
|
||||
var unsignedIntegers = {tyUInt8..tyUInt32, tyChar}
|
||||
let src = n.sons[1].typ.skipTypes(abstractRange)#.kind
|
||||
let dst = n.sons[0].typ.skipTypes(abstractRange)#.kind
|
||||
let src_size = src.getSize
|
||||
|
||||
if platform.intSize < 8:
|
||||
signedIntegers.incl(tyInt)
|
||||
unsignedIntegers.incl(tyUInt)
|
||||
if src_size == dst.getSize and src.kind in allowedIntegers and
|
||||
dst.kind in allowedIntegers:
|
||||
let tmp = c.genx(n.sons[1])
|
||||
var tmp2 = c.getTemp(n.sons[1].typ)
|
||||
let tmp3 = c.getTemp(n.sons[1].typ)
|
||||
if dest < 0: dest = c.getTemp(n[0].typ)
|
||||
proc mkIntLit(ival: int): int =
|
||||
result = genLiteral(c, newIntTypeNode(nkIntLit, ival, getSysType(tyInt)))
|
||||
if src.kind in unsignedIntegers and dst.kind in signedIntegers:
|
||||
# cast unsigned to signed integer of same size
|
||||
# signedVal = (unsignedVal xor offset) -% offset
|
||||
let offset = 1 shl (src_size * 8 - 1)
|
||||
c.gABx(n, opcLdConst, tmp2, mkIntLit(offset))
|
||||
c.gABC(n, opcBitxorInt, tmp3, tmp, tmp2)
|
||||
c.gABC(n, opcSubInt, dest, tmp3, tmp2)
|
||||
elif src.kind in signedIntegers and dst.kind in unsignedIntegers:
|
||||
# cast signed to unsigned integer of same size
|
||||
# unsignedVal = (offset +% signedVal +% 1) and offset
|
||||
let offset = (1 shl (src_size * 8)) - 1
|
||||
c.gABx(n, opcLdConst, tmp2, mkIntLit(offset))
|
||||
c.gABx(n, opcLdConst, dest, mkIntLit(offset+1))
|
||||
c.gABC(n, opcAddu, tmp3, tmp, dest)
|
||||
c.gABC(n, opcNarrowU, tmp3, TRegister(src_size*8))
|
||||
c.gABC(n, opcBitandInt, dest, tmp3, tmp2)
|
||||
else:
|
||||
c.gABC(n, opcAsgnInt, dest, tmp)
|
||||
c.freeTemp(tmp)
|
||||
c.freeTemp(tmp2)
|
||||
c.freeTemp(tmp3)
|
||||
else:
|
||||
globalError(n.info, errGenerated, "VM is only allowed to 'cast' between integers of same size")
|
||||
|
||||
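The new ``genIntCast`` above lets the VM cast between same-sized integer types without raw memory access; its unsigned-to-signed branch relies on the xor/offset identity spelled out in the comments. A stand-alone sketch of that identity for the 8-bit case (illustrative only, not part of the patch):

.. code-block:: nim
  # signedVal = (unsignedVal xor offset) - offset, with offset = 1 shl (bits - 1)
  proc toSigned8(u: int): int =
    let offset = 1 shl 7              # the sign bit of an 8-bit value
    result = (u xor offset) - offset

  assert toSigned8(200) == -56        # same bit pattern as cast[int8](200'u8)
  assert toSigned8(17) == 17          # values below the sign bit are unchanged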
proc genMagic(c: PCtx; n: PNode; dest: var TDest; m: TMagic) =
|
||||
case m
|
||||
of mAnd: c.genAndOr(n, opcFJmp, dest)
|
||||
@@ -995,7 +1038,7 @@ proc genMagic(c: PCtx; n: PNode; dest: var TDest; m: TMagic) =
|
||||
let n = n[1].skipConv
|
||||
let x = c.getTempRange(n.len, slotTempUnknown)
|
||||
internalAssert n.kind == nkBracket
|
||||
for i in 0.. <n.len:
|
||||
for i in 0..<n.len:
|
||||
var r: TRegister = x+i
|
||||
c.gen(n.sons[i], r)
|
||||
c.gABC(n, opcEcho, x, n.len)
|
||||
@@ -1130,6 +1173,8 @@ proc genMagic(c: PCtx; n: PNode; dest: var TDest; m: TMagic) =
|
||||
# produces a value
|
||||
else:
|
||||
globalError(n.info, "expandToAst requires a call expression")
|
||||
of mRunnableExamples:
|
||||
discard "just ignore any call to runnableExamples"
|
||||
else:
|
||||
# mGCref, mGCunref,
|
||||
globalError(n.info, "cannot generate code for: " & $m)
|
||||
@@ -1645,7 +1690,7 @@ proc genObjConstr(c: PCtx, n: PNode, dest: var TDest) =
|
||||
c.gABx(n, opcNew, dest, c.genType(t.sons[0]))
|
||||
else:
|
||||
c.gABx(n, opcLdNull, dest, c.genType(n.typ))
|
||||
for i in 1.. <n.len:
|
||||
for i in 1..<n.len:
|
||||
let it = n.sons[i]
|
||||
if it.kind == nkExprColonExpr and it.sons[0].kind == nkSym:
|
||||
let idx = genField(it.sons[0])
|
||||
@@ -1660,7 +1705,7 @@ proc genTupleConstr(c: PCtx, n: PNode, dest: var TDest) =
|
||||
if dest < 0: dest = c.getTemp(n.typ)
|
||||
c.gABx(n, opcLdNull, dest, c.genType(n.typ))
|
||||
# XXX x = (x.old, 22) produces wrong code ... stupid self assignments
|
||||
for i in 0.. <n.len:
|
||||
for i in 0..<n.len:
|
||||
let it = n.sons[i]
|
||||
if it.kind == nkExprColonExpr:
|
||||
let idx = genField(it.sons[0])
|
||||
@@ -1773,8 +1818,8 @@ proc gen(c: PCtx; n: PNode; dest: var TDest; flags: TGenFlags = {}) =
|
||||
of nkAddr, nkHiddenAddr: genAddrDeref(c, n, dest, opcAddrNode, flags)
|
||||
of nkIfStmt, nkIfExpr: genIf(c, n, dest)
|
||||
of nkWhenStmt:
|
||||
# This is "when nimvm" node. Chose the first branch.
|
||||
gen(c, n.sons[0].sons[1], dest)
|
||||
# This is "when nimvm" node. Chose the first branch.
|
||||
gen(c, n.sons[0].sons[1], dest)
|
||||
of nkCaseStmt: genCase(c, n, dest)
|
||||
of nkWhileStmt:
|
||||
unused(n, dest)
|
||||
@@ -1796,7 +1841,7 @@ proc gen(c: PCtx; n: PNode; dest: var TDest; flags: TGenFlags = {}) =
|
||||
for x in n: gen(c, x)
|
||||
of nkStmtListExpr:
|
||||
let L = n.len-1
|
||||
for i in 0 .. <L: gen(c, n.sons[i])
|
||||
for i in 0 ..< L: gen(c, n.sons[i])
|
||||
gen(c, n.sons[L], dest, flags)
|
||||
of nkPragmaBlock:
|
||||
gen(c, n.lastSon, dest, flags)
|
||||
@@ -1810,7 +1855,7 @@ proc gen(c: PCtx; n: PNode; dest: var TDest; flags: TGenFlags = {}) =
|
||||
of nkVarSection, nkLetSection:
|
||||
unused(n, dest)
|
||||
genVarSection(c, n)
|
||||
of declarativeDefs:
|
||||
of declarativeDefs, nkMacroDef:
|
||||
unused(n, dest)
|
||||
of nkLambdaKinds:
|
||||
#let s = n.sons[namePos].sym
|
||||
@@ -1842,9 +1887,11 @@ proc gen(c: PCtx; n: PNode; dest: var TDest; flags: TGenFlags = {}) =
|
||||
if allowCast in c.features:
|
||||
genConv(c, n, n.sons[1], dest, opcCast)
|
||||
else:
|
||||
globalError(n.info, errGenerated, "VM is not allowed to 'cast'")
|
||||
genIntCast(c, n, dest)
|
||||
of nkTypeOfExpr:
|
||||
genTypeLit(c, n.typ, dest)
|
||||
of nkComesFrom:
|
||||
discard "XXX to implement for better stack traces"
|
||||
else:
|
||||
globalError(n.info, errGenerated, "cannot generate VM code for " & $n)
|
||||
|
||||
@@ -1882,7 +1929,7 @@ proc genExpr*(c: PCtx; n: PNode, requiresValue = true): int =
|
||||
proc genParams(c: PCtx; params: PNode) =
|
||||
# res.sym.position is already 0
|
||||
c.prc.slots[0] = (inUse: true, kind: slotFixedVar)
|
||||
for i in 1.. <params.len:
|
||||
for i in 1..<params.len:
|
||||
c.prc.slots[i] = (inUse: true, kind: slotFixedLet)
|
||||
c.prc.maxSlots = max(params.len, 1)
|
||||
|
||||
@@ -1895,7 +1942,7 @@ proc finalJumpTarget(c: PCtx; pc, diff: int) =
|
||||
|
||||
proc genGenericParams(c: PCtx; gp: PNode) =
|
||||
var base = c.prc.maxSlots
|
||||
for i in 0.. <gp.len:
|
||||
for i in 0..<gp.len:
|
||||
var param = gp.sons[i].sym
|
||||
param.position = base + i # XXX: fix this earlier; make it consistent with templates
|
||||
c.prc.slots[base + i] = (inUse: true, kind: slotFixedLet)
|
||||
@@ -1903,7 +1950,7 @@ proc genGenericParams(c: PCtx; gp: PNode) =
|
||||
|
||||
proc optimizeJumps(c: PCtx; start: int) =
|
||||
const maxIterations = 10
|
||||
for i in start .. <c.code.len:
|
||||
for i in start ..< c.code.len:
|
||||
let opc = c.code[i].opcode
|
||||
case opc
|
||||
of opcTJmp, opcFJmp:
|
||||
|
||||
@@ -78,7 +78,7 @@ proc storeAny(s: var string; t: PType; a: PNode; stored: var IntSet) =
|
||||
s.add("]")
|
||||
of tyTuple:
|
||||
s.add("{")
|
||||
for i in 0.. <t.len:
|
||||
for i in 0..<t.len:
|
||||
if i > 0: s.add(", ")
|
||||
s.add("\"Field" & $i)
|
||||
s.add("\": ")
|
||||
@@ -90,7 +90,7 @@ proc storeAny(s: var string; t: PType; a: PNode; stored: var IntSet) =
|
||||
s.add("}")
|
||||
of tySet:
|
||||
s.add("[")
|
||||
for i in 0.. <a.len:
|
||||
for i in 0..<a.len:
|
||||
if i > 0: s.add(", ")
|
||||
if a[i].kind == nkRange:
|
||||
var x = copyNode(a[i][0])
|
||||
|
||||
@@ -47,6 +47,11 @@ template wrap1s_ospaths(op) {.dirty.} =
|
||||
setResult(a, op(getString(a, 0)))
|
||||
ospathsop op
|
||||
|
||||
template wrap2s_ospaths(op) {.dirty.} =
|
||||
proc `op Wrapper`(a: VmArgs) {.nimcall.} =
|
||||
setResult(a, op(getString(a, 0), getString(a, 1)))
|
||||
ospathsop op
|
||||
|
||||
template wrap1s_system(op) {.dirty.} =
|
||||
proc `op Wrapper`(a: VmArgs) {.nimcall.} =
|
||||
setResult(a, op(getString(a, 0)))
|
||||
@@ -96,7 +101,7 @@ proc registerAdditionalOps*(c: PCtx) =
|
||||
wrap1f_math(ceil)
|
||||
wrap2f_math(fmod)
|
||||
|
||||
wrap1s_ospaths(getEnv)
|
||||
wrap2s_ospaths(getEnv)
|
||||
wrap1s_ospaths(existsEnv)
|
||||
wrap1s_os(dirExists)
|
||||
wrap1s_os(fileExists)
|
||||
|
||||
@@ -21,11 +21,11 @@ type
|
||||
TSpecialWord* = enum
|
||||
wInvalid,
|
||||
|
||||
wAddr, wAnd, wAs, wAsm, wAtomic,
|
||||
wAddr, wAnd, wAs, wAsm,
|
||||
wBind, wBlock, wBreak, wCase, wCast, wConcept, wConst,
|
||||
wContinue, wConverter, wDefer, wDiscard, wDistinct, wDiv, wDo,
|
||||
wElif, wElse, wEnd, wEnum, wExcept, wExport,
|
||||
wFinally, wFor, wFrom, wFunc, wGeneric, wIf, wImport, wIn,
|
||||
wFinally, wFor, wFrom, wFunc, wIf, wImport, wIn,
|
||||
wInclude, wInterface, wIs, wIsnot, wIterator, wLet,
|
||||
wMacro, wMethod, wMixin, wMod, wNil,
|
||||
wNot, wNotin, wObject, wOf, wOr, wOut, wProc, wPtr, wRaise, wRef, wReturn,
|
||||
@@ -45,7 +45,7 @@ type
|
||||
wImportc, wExportc, wExportNims, wIncompleteStruct, wRequiresInit,
|
||||
wAlign, wNodecl, wPure, wSideeffect, wHeader,
|
||||
wNosideeffect, wGcSafe, wNoreturn, wMerge, wLib, wDynlib,
|
||||
wCompilerproc, wProcVar, wBase, wUsed,
|
||||
wCompilerproc, wCore, wProcVar, wBase, wUsed,
|
||||
wFatal, wError, wWarning, wHint, wLine, wPush, wPop, wDefine, wUndef,
|
||||
wLinedir, wStacktrace, wLinetrace, wLink, wCompile,
|
||||
wLinksys, wDeprecated, wVarargs, wCallconv, wBreakpoint, wDebugger,
|
||||
@@ -55,7 +55,7 @@ type
|
||||
wFloatchecks, wNanChecks, wInfChecks,
|
||||
wAssertions, wPatterns, wWarnings,
|
||||
wHints, wOptimization, wRaises, wWrites, wReads, wSize, wEffects, wTags,
|
||||
wDeadCodeElim, wSafecode, wNoForward, wReorder, wNoRewrite,
|
||||
wDeadCodeElim, wSafecode, wPackage, wNoForward, wReorder, wNoRewrite,
|
||||
wPragma,
|
||||
wCompileTime, wNoInit,
|
||||
wPassc, wPassl, wBorrow, wDiscardable,
|
||||
@@ -66,7 +66,7 @@ type
|
||||
wWrite, wGensym, wInject, wDirty, wInheritable, wThreadVar, wEmit,
|
||||
wAsmNoStackFrame,
|
||||
wImplicitStatic, wGlobal, wCodegenDecl, wUnchecked, wGuard, wLocks,
|
||||
wPartial, wExplain,
|
||||
wPartial, wExplain, wLiftLocals,
|
||||
|
||||
wAuto, wBool, wCatch, wChar, wClass,
|
||||
wConst_cast, wDefault, wDelete, wDouble, wDynamic_cast,
|
||||
@@ -103,12 +103,12 @@ const
|
||||
|
||||
specialWords*: array[low(TSpecialWord)..high(TSpecialWord), string] = ["",
|
||||
|
||||
"addr", "and", "as", "asm", "atomic",
|
||||
"addr", "and", "as", "asm",
|
||||
"bind", "block", "break", "case", "cast",
|
||||
"concept", "const", "continue", "converter",
|
||||
"defer", "discard", "distinct", "div", "do",
|
||||
"elif", "else", "end", "enum", "except", "export",
|
||||
"finally", "for", "from", "func", "generic", "if",
|
||||
"finally", "for", "from", "func", "if",
|
||||
"import", "in", "include", "interface", "is", "isnot", "iterator",
|
||||
"let",
|
||||
"macro", "method", "mixin", "mod", "nil", "not", "notin",
|
||||
@@ -131,7 +131,7 @@ const
|
||||
"incompletestruct",
|
||||
"requiresinit", "align", "nodecl", "pure", "sideeffect",
|
||||
"header", "nosideeffect", "gcsafe", "noreturn", "merge", "lib", "dynlib",
|
||||
"compilerproc", "procvar", "base", "used",
|
||||
"compilerproc", "core", "procvar", "base", "used",
|
||||
"fatal", "error", "warning", "hint", "line",
|
||||
"push", "pop", "define", "undef", "linedir", "stacktrace", "linetrace",
|
||||
"link", "compile", "linksys", "deprecated", "varargs",
|
||||
@@ -143,7 +143,7 @@ const
|
||||
|
||||
"assertions", "patterns", "warnings", "hints",
|
||||
"optimization", "raises", "writes", "reads", "size", "effects", "tags",
|
||||
"deadcodeelim", "safecode", "noforward", "reorder", "norewrite",
|
||||
"deadcodeelim", "safecode", "package", "noforward", "reorder", "norewrite",
|
||||
"pragma",
|
||||
"compiletime", "noinit",
|
||||
"passc", "passl", "borrow", "discardable", "fieldchecks",
|
||||
@@ -152,7 +152,7 @@ const
|
||||
"computedgoto", "injectstmt", "experimental",
|
||||
"write", "gensym", "inject", "dirty", "inheritable", "threadvar", "emit",
|
||||
"asmnostackframe", "implicitstatic", "global", "codegendecl", "unchecked",
|
||||
"guard", "locks", "partial", "explain",
|
||||
"guard", "locks", "partial", "explain", "liftlocals",
|
||||
|
||||
"auto", "bool", "catch", "char", "class",
|
||||
"const_cast", "default", "delete", "double",
|
||||
|
||||
@@ -123,7 +123,7 @@ proc returnsNewExpr*(n: PNode): NewLocation =
|
||||
of nkCurly, nkBracket, nkPar, nkObjConstr, nkClosure,
|
||||
nkIfExpr, nkIfStmt, nkWhenStmt, nkCaseStmt, nkTryStmt:
|
||||
result = newLit
|
||||
for i in ord(n.kind == nkObjConstr) .. <n.len:
|
||||
for i in ord(n.kind == nkObjConstr) ..< n.len:
|
||||
let x = returnsNewExpr(n.sons[i])
|
||||
case x
|
||||
of newNone: return newNone
|
||||
|
||||
@@ -98,7 +98,6 @@ path="$lib/pure"
|
||||
clang.options.linker = "-landroid-glob"
|
||||
clang.cpp.options.linker = "-landroid-glob"
|
||||
tcc.options.linker = "-landroid-glob"
|
||||
define:"useShPath:/system/bin/sh"
|
||||
@end
|
||||
@end
|
||||
|
||||
@@ -202,6 +201,11 @@ vcc.cpp.linkerexe = "vccexe.exe"
|
||||
|
||||
# set the options for specific platforms:
|
||||
vcc.options.always = "/nologo"
|
||||
@if release:
|
||||
# no debug symbols in release builds
|
||||
@else:
|
||||
vcc.options.always %= "${vcc.options.always} /Z7" # Get VCC to output full debug symbols in the obj file
|
||||
@end
|
||||
vcc.cpp.options.always %= "${vcc.options.always} /EHsc"
|
||||
vcc.options.linker = "/nologo /DEBUG /Zi /F33554432" # set the stack size to 32 MiB
|
||||
vcc.cpp.options.linker %= "${vcc.options.linker}"
|
||||
@@ -222,8 +226,8 @@ vcc.options.linker %= "--platform:arm ${vcc.options.linker}"
|
||||
vcc.cpp.options.linker %= "--platform:arm ${vcc.cpp.options.linker}"
|
||||
@end
|
||||
|
||||
vcc.options.debug = "/Zi /FS /Od"
|
||||
vcc.cpp.options.debug = "/Zi /FS /Od"
|
||||
vcc.options.debug = "/Od"
|
||||
vcc.cpp.options.debug = "/Od"
|
||||
vcc.options.speed = "/O2"
|
||||
vcc.cpp.options.speed = "/O2"
|
||||
vcc.options.size = "/O1"
|
||||
|
||||
@@ -88,10 +88,58 @@ doc.body_toc = """
|
||||
</div>
|
||||
"""
|
||||
|
||||
@if boot:
|
||||
# This is enabled with the "boot" directive to generate
|
||||
# the compiler documentation.
|
||||
# As a user, tweak the block below instead.
|
||||
# You can add your own global-links entries
|
||||
doc.body_toc_group = """
|
||||
<div class="row">
|
||||
<div class="three columns">
|
||||
<div>
|
||||
<div id="global-links">
|
||||
<ul class="simple">
|
||||
<li>
|
||||
<a href="manual.html">Manual</a>
|
||||
</li>
|
||||
<li>
|
||||
<a href="lib.html">Standard library</a>
|
||||
</li>
|
||||
<li>
|
||||
<a href="theindex.html">Index</a>
|
||||
</li>
|
||||
</ul>
|
||||
</div>
|
||||
<div id="searchInput">
|
||||
Search: <input type="text" id="searchInput"
|
||||
onkeyup="search()" />
|
||||
</div>
|
||||
<div class="search-groupby">
|
||||
Group by:
|
||||
<select onchange="groupBy(this.value)">
|
||||
<option value="section">Section</option>
|
||||
<option value="type">Type</option>
|
||||
</select>
|
||||
</div>
|
||||
$tableofcontents
|
||||
</div>
|
||||
<div class="nine columns" id="content">
|
||||
<div id="tocRoot"></div>
|
||||
<p class="module-desc">$moduledesc</p>
|
||||
$content
|
||||
</div>
|
||||
</div>
|
||||
"""
|
||||
|
||||
@else
|
||||
|
||||
doc.body_toc_group = """
|
||||
<div class="row">
|
||||
<div class="three columns">
|
||||
<div id="global-links">
|
||||
<ul class="simple">
|
||||
</ul>
|
||||
</div>
|
||||
<div id="searchInput">
|
||||
Search: <input type="text" id="searchInput"
|
||||
onkeyup="search()" />
|
||||
</div>
|
||||
@@ -111,6 +159,7 @@ doc.body_toc_group = """
|
||||
</div>
|
||||
</div>
|
||||
"""
|
||||
@end
|
||||
|
||||
doc.body_no_toc = """
|
||||
$moduledesc
|
||||
@@ -135,7 +184,7 @@ doc.file = """<?xml version="1.0" encoding="utf-8" ?>
|
||||
<link rel="shortcut icon" href="data:image/x-icon;base64,AAABAAEAEBAAAAEAIABoBAAAFgAAACgAAAAQAAAAIAAAAAEAIAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAD///8A////AP///wD///8A////AP///wD///8A////AP///wD///8A////AP///wD///8A////AP///wD///8A////AP///wD///8A////AP///wD///8A////AAAAAAUAAAAF////AP///wD///8A////AP///wD///8A////AP///wD///8A////AAAAAAIAAABbAAAAlQAAAKIAAACbAAAAmwAAAKIAAACVAAAAWwAAAAL///8A////AP///wD///8A////AAAAABQAAADAAAAAYwAAAA3///8A////AP///wD///8AAAAADQAAAGMAAADAAAAAFP///wD///8A////AP///wAAAACdAAAAOv///wD///8A////AP///wD///8A////AP///wD///8AAAAAOgAAAJ3///8A////AP///wAAAAAnAAAAcP///wAAAAAoAAAASv///wD///8A////AP///wAAAABKAAAAKP///wAAAABwAAAAJ////wD///8AAAAAgQAAABwAAACIAAAAkAAAAJMAAACtAAAAFQAAABUAAACtAAAAkwAAAJAAAACIAAAAHAAAAIH///8A////AAAAAKQAAACrAAAAaP///wD///8AAAAARQAAANIAAADSAAAARf///wD///8AAAAAaAAAAKsAAACk////AAAAADMAAACcAAAAnQAAABj///8A////AP///wAAAAAYAAAAGP///wD///8A////AAAAABgAAACdAAAAnAAAADMAAAB1AAAAwwAAAP8AAADpAAAAsQAAAE4AAAAb////AP///wAAAAAbAAAATgAAALEAAADpAAAA/wAAAMMAAAB1AAAAtwAAAOkAAAD/AAAA/wAAAP8AAADvAAAA3gAAAN4AAADeAAAA3gAAAO8AAAD/AAAA/wAAAP8AAADpAAAAtwAAAGUAAAA/AAAA3wAAAP8AAAD/AAAA/wAAAP8AAAD/AAAA/wAAAP8AAAD/AAAA/wAAAP8AAADfAAAAPwAAAGX///8A////AAAAAEgAAADtAAAAvwAAAL0AAADGAAAA7wAAAO8AAADGAAAAvQAAAL8AAADtAAAASP///wD///8A////AP///wD///8AAAAAO////wD///8A////AAAAAIcAAACH////AP///wD///8AAAAAO////wD///8A////AP///wD///8A////AP///wD///8A////AP///wD///8A////AP///wD///8A////AP///wD///8A////AP///wD///8A////AP///wD///8A////AP///wD///8A////AP///wD///8A////AP///wD///8A////AP///wD///8A//8AAP//AAD4HwAA7/cAAN/7AAD//wAAoYUAAJ55AACf+QAAh+EAAAAAAADAAwAA4AcAAP5/AAD//wAA//8AAA=="/>
|
||||
|
||||
<!-- Google fonts -->
|
||||
<link href='https://fonts.googleapis.com/css?family=Raleway:400,600,900' rel='stylesheet' type='text/css'/>
|
||||
<link href='https://fonts.googleapis.com/css?family=Lato:400,600,900' rel='stylesheet' type='text/css'/>
|
||||
<link href='https://fonts.googleapis.com/css?family=Source+Code+Pro:400,500,600' rel='stylesheet' type='text/css'/>
|
||||
|
||||
<!-- CSS -->
|
||||
@@ -168,18 +217,19 @@ html {
|
||||
|
||||
/* Where we want fancier font if available */
|
||||
h1, h2, h3, h4, h5, h6, p.module-desc, table.docinfo + blockquote p, table.docinfo blockquote p, h1 + blockquote p {
|
||||
font-family: "Raleway", "Helvetica Neue", "HelveticaNeue", Helvetica, Arial, sans-serif !important; }
|
||||
font-family: "Lato", "Helvetica Neue", "HelveticaNeue", Helvetica, Arial, sans-serif !important; }
|
||||
|
||||
h1.title {
|
||||
font-weight: 900; }
|
||||
|
||||
body {
|
||||
font-family: "Helvetica Neue", "HelveticaNeue", "Raleway", Helvetica, Arial, sans-serif;
|
||||
font-family: "Lato", "Helvetica Neue", "HelveticaNeue", Helvetica, Arial, sans-serif;
|
||||
font-weight: 400;
|
||||
font-size: 14px;
|
||||
font-size: 16px;
|
||||
line-height: 20px;
|
||||
color: #666;
|
||||
background-color: rgba(252, 248, 244, 0.75); }
|
||||
color: #444;
|
||||
letter-spacing: 0.15px;
|
||||
background-color: rgba(252, 248, 244, 0.45); }
|
||||
|
||||
/* Skeleton grid */
|
||||
.container {
|
||||
@@ -295,8 +345,8 @@ cite {
|
||||
font-style: italic !important; }
|
||||
|
||||
dt > pre {
|
||||
border-color: rgba(0, 0, 0, 0.15);
|
||||
background-color: transparent;
|
||||
border-color: rgba(0, 0, 0, 0.1);
|
||||
background-color: rgba(255, 255, 255, 0.3);
|
||||
margin: 15px 0px 5px; }
|
||||
|
||||
dd > pre {
|
||||
@@ -313,6 +363,17 @@ dd > pre {
|
||||
width: 100%;
|
||||
table-layout: fixed; }
|
||||
|
||||
/* Nim search input */
|
||||
div#searchInput {
|
||||
margin-bottom: 8px;
|
||||
}
|
||||
div#searchInput input#searchInput {
|
||||
width: 10em;
|
||||
}
|
||||
div.search-groupby {
|
||||
margin-bottom: 8px;
|
||||
}
|
||||
|
||||
table.line-nums-table {
|
||||
border-radius: 4px;
|
||||
border: 1px solid #cccccc;
|
||||
@@ -456,7 +517,7 @@ img {
|
||||
box-shadow: 0 1px 3px rgba(0, 0, 0, 0.1); }
|
||||
|
||||
p {
|
||||
margin: 0 0 12px; }
|
||||
margin: 0 0 8px; }
|
||||
|
||||
small {
|
||||
font-size: 85%; }
|
||||
@@ -476,7 +537,7 @@ h3,
|
||||
h4,
|
||||
h5,
|
||||
h6 {
|
||||
font-family: "Helvetica Neue", "HelveticaNeue", "Raleway", Helvetica, Arial, sans-serif;
|
||||
font-family: "Lato", "Helvetica Neue", "HelveticaNeue", Helvetica, Arial, sans-serif;
|
||||
font-weight: 600;
|
||||
line-height: 20px;
|
||||
color: inherit;
|
||||
@@ -484,6 +545,7 @@ h6 {
|
||||
|
||||
h1 {
|
||||
font-size: 2em;
|
||||
font-weight: 400;
|
||||
padding-bottom: .15em;
|
||||
border-bottom: 1px solid #aaaaaa;
|
||||
margin-top: 1.0em;
|
||||
@@ -614,13 +676,13 @@ pre {
|
||||
box-sizing: border-box;
|
||||
min-width: calc(100% - 19.5px);
|
||||
padding: 9.5px;
|
||||
margin: 0.25em 10px 0.25em 10px;
|
||||
font-size: 14px;
|
||||
margin: 0.25em 10px 10px 10px;
|
||||
font-size: 15px;
|
||||
line-height: 20px;
|
||||
white-space: pre !important;
|
||||
overflow-y: hidden;
|
||||
overflow-x: visible;
|
||||
background-color: whitesmoke;
|
||||
background-color: rgba(0, 0, 0, 0.01);
|
||||
border: 1px solid #cccccc;
|
||||
-webkit-border-radius: 4px;
|
||||
-moz-border-radius: 4px;
|
||||
@@ -899,14 +961,14 @@ div.admonition p.admonition-title, div.hint p.admonition-title,
|
||||
div.important p.admonition-title, div.note p.admonition-title,
|
||||
div.tip p.admonition-title {
|
||||
font-weight: bold;
|
||||
font-family: "Helvetica Neue", "HelveticaNeue", "Raleway", Helvetica, Arial, sans-serif; }
|
||||
font-family: "Lato", "Helvetica Neue", "HelveticaNeue", Helvetica, Arial, sans-serif; }
|
||||
|
||||
div.attention p.admonition-title, div.caution p.admonition-title,
|
||||
div.danger p.admonition-title, div.error p.admonition-title,
|
||||
div.warning p.admonition-title, .code .error {
|
||||
color: #b30000;
|
||||
font-weight: bold;
|
||||
font-family: "Helvetica Neue", "HelveticaNeue", "Raleway", Helvetica, Arial, sans-serif; }
|
||||
font-family: "Lato", "Helvetica Neue", "HelveticaNeue", Helvetica, Arial, sans-serif; }
|
||||
|
||||
/* Uncomment (and remove this text!) to get reduced vertical space in
|
||||
compound paragraphs.
|
||||
@@ -953,7 +1015,7 @@ div.sidebar {
|
||||
clear: right; }
|
||||
|
||||
div.sidebar p.rubric {
|
||||
font-family: "Helvetica Neue", "HelveticaNeue", "Raleway", Helvetica, Arial, sans-serif;
|
||||
font-family: "Lato", "Helvetica Neue", "HelveticaNeue", Helvetica, Arial, sans-serif;
|
||||
font-size: medium; }
|
||||
|
||||
div.system-messages {
|
||||
@@ -1060,12 +1122,12 @@ p.rubric {
|
||||
text-align: center; }
|
||||
|
||||
p.sidebar-title {
|
||||
font-family: "Helvetica Neue", "HelveticaNeue", "Raleway", Helvetica, Arial, sans-serif;
|
||||
font-family: "Lato", "Helvetica Neue", "HelveticaNeue", Helvetica, Arial, sans-serif;
|
||||
font-weight: bold;
|
||||
font-size: larger; }
|
||||
|
||||
p.sidebar-subtitle {
|
||||
font-family: "Helvetica Neue", "HelveticaNeue", "Raleway", Helvetica, Arial, sans-serif;
|
||||
font-family: "Lato", "Helvetica Neue", "HelveticaNeue", Helvetica, Arial, sans-serif;
|
||||
font-weight: bold; }
|
||||
|
||||
p.topic-title {
|
||||
@@ -1107,15 +1169,15 @@ pre.code .inserted, code .inserted {
|
||||
background-color: #A3D289; }
|
||||
|
||||
span.classifier {
|
||||
font-family: "Helvetica Neue", "HelveticaNeue", "Raleway", Helvetica, Arial, sans-serif;
|
||||
font-family: "Lato", "Helvetica Neue", "HelveticaNeue", Helvetica, Arial, sans-serif;
|
||||
font-style: oblique; }
|
||||
|
||||
span.classifier-delimiter {
|
||||
font-family: "Helvetica Neue", "HelveticaNeue", "Raleway", Helvetica, Arial, sans-serif;
|
||||
font-family: "Lato", "Helvetica Neue", "HelveticaNeue", Helvetica, Arial, sans-serif;
|
||||
font-weight: bold; }
|
||||
|
||||
span.interpreted {
|
||||
font-family: "Helvetica Neue", "HelveticaNeue", "Raleway", Helvetica, Arial, sans-serif; }
|
||||
font-family: "Lato", "Helvetica Neue", "HelveticaNeue", Helvetica, Arial, sans-serif; }
|
||||
|
||||
span.option {
|
||||
white-space: nowrap; }
|
||||
@@ -1138,7 +1200,7 @@ table.docinfo {
|
||||
margin: 0em;
|
||||
margin-top: 2em;
|
||||
margin-bottom: 2em;
|
||||
font-family: "Raleway", "Helvetica Neue", "HelveticaNeue", Helvetica, Arial, sans-serif !important;
|
||||
font-family: "Lato", "Helvetica Neue", "HelveticaNeue", Helvetica, Arial, sans-serif !important;
|
||||
color: #444444; }
|
||||
|
||||
table.docutils {
|
||||
@@ -1268,15 +1330,15 @@ dt pre > span.Operator ~ span.Identifier, dt pre > span.Operator ~ span.Operator
|
||||
background-repeat: no-repeat;
|
||||
background-image: url("data:image/x-icon;base64,AAABAAEAEBAAAAEAIABoBAAAFgAAACgAAAAQAAAAIAAAAAEAIAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAD///8A////AP///wD///8A////AP///wD///8A////AP///wD///8A////AP///wD///8A////AP///wD///8A////AP///wD///8A////AP///wD///8A////AAAAAAUAAAAF////AP///wD///8A////AP///wD///8A////AP///wD///8A////AAAAAAIAAABbAAAAlQAAAKIAAACbAAAAmwAAAKIAAACVAAAAWwAAAAL///8A////AP///wD///8A////AAAAABQAAADAAAAAYwAAAA3///8A////AP///wD///8AAAAADQAAAGMAAADAAAAAFP///wD///8A////AP///wAAAACdAAAAOv///wD///8A////AP///wD///8A////AP///wD///8AAAAAOgAAAJ3///8A////AP///wAAAAAnAAAAcP///wAAAAAoAAAASv///wD///8A////AP///wAAAABKAAAAKP///wAAAABwAAAAJ////wD///8AAAAAgQAAABwAAACIAAAAkAAAAJMAAACtAAAAFQAAABUAAACtAAAAkwAAAJAAAACIAAAAHAAAAIH///8A////AAAAAKQAAACrAAAAaP///wD///8AAAAARQAAANIAAADSAAAARf///wD///8AAAAAaAAAAKsAAACk////AAAAADMAAACcAAAAnQAAABj///8A////AP///wAAAAAYAAAAGP///wD///8A////AAAAABgAAACdAAAAnAAAADMAAAB1AAAAwwAAAP8AAADpAAAAsQAAAE4AAAAb////AP///wAAAAAbAAAATgAAALEAAADpAAAA/wAAAMMAAAB1AAAAtwAAAOkAAAD/AAAA/wAAAP8AAADvAAAA3gAAAN4AAADeAAAA3gAAAO8AAAD/AAAA/wAAAP8AAADpAAAAtwAAAGUAAAA/AAAA3wAAAP8AAAD/AAAA/wAAAP8AAAD/AAAA/wAAAP8AAAD/AAAA/wAAAP8AAADfAAAAPwAAAGX///8A////AAAAAEgAAADtAAAAvwAAAL0AAADGAAAA7wAAAO8AAADGAAAAvQAAAL8AAADtAAAASP///wD///8A////AP///wD///8AAAAAO////wD///8A////AAAAAIcAAACH////AP///wD///8AAAAAO////wD///8A////AP///wD///8A////AP///wD///8A////AP///wD///8A////AP///wD///8A////AP///wD///8A////AP///wD///8A////AP///wD///8A////AP///wD///8A////AP///wD///8A////AP///wD///8A////AP///wD///8A//8AAP//AAD4HwAA7/cAAN/7AAD//wAAoYUAAJ55AACf+QAAh+EAAAAAAADAAwAA4AcAAP5/AAD//wAA//8AAA==");
|
||||
margin-bottom: -5px; }
|
||||
div.pragma {
|
||||
display: none;
|
||||
}
|
||||
span.pragmabegin {
|
||||
cursor: pointer;
|
||||
}
|
||||
span.pragmaend {
|
||||
cursor: pointer;
|
||||
}
|
||||
div.pragma {
|
||||
display: none;
|
||||
}
|
||||
span.pragmabegin {
|
||||
cursor: pointer;
|
||||
}
|
||||
span.pragmaend {
|
||||
cursor: pointer;
|
||||
}
|
||||
|
||||
div.search_results {
|
||||
background-color: antiquewhite;
|
||||
@@ -1284,6 +1346,11 @@ div.search_results {
|
||||
padding: 1em;
|
||||
border: 1px solid #4d4d4d;
|
||||
}
|
||||
|
||||
div#global-links ul {
|
||||
margin-left: 0;
|
||||
list-style-type: none;
|
||||
}
|
||||
</style>
|
||||
|
||||
<script type="text/javascript" src="../dochack.js"></script>
|
||||
|
||||
@@ -7,7 +7,8 @@ Advanced commands:
|
||||
//rst2html convert a reStructuredText file to HTML
|
||||
//rst2tex convert a reStructuredText file to TeX
|
||||
//jsondoc extract the documentation to a json file
|
||||
//jsondoc2 extract documentation to a json file (uses doc2)
|
||||
//jsondoc2 extract the documentation to a json file (uses doc2)
|
||||
//ctags create a tags file
|
||||
//buildIndex build an index for the whole documentation
|
||||
//run run the project (with Tiny C backend; buggy!)
|
||||
//genDepend generate a DOT file containing the
|
||||
@@ -36,6 +37,7 @@ Advanced options:
|
||||
--noMain do not generate a main procedure
|
||||
--genScript generate a compile script (in the 'nimcache'
|
||||
subdirectory named 'compile_$project$scriptext')
|
||||
--genDeps generate a '.deps' file containing the dependencies
|
||||
--os:SYMBOL set the target operating system (cross-compilation)
|
||||
--cpu:SYMBOL set the target processor (cross-compilation)
|
||||
--debuginfo enables debug information
|
||||
@@ -78,6 +80,7 @@ Advanced options:
|
||||
symbol matching is fuzzy so
|
||||
that --dynlibOverride:lua matches
|
||||
dynlib: "liblua.so.3"
|
||||
--dynlibOverrideAll makes the dynlib pragma have no effect
|
||||
--listCmd list the commands used to execute external programs
|
||||
--parallelBuild:0|1|... perform a parallel build
|
||||
value = number of processors (0 for auto-detect)
|
||||
|
||||
@@ -20,10 +20,12 @@ to compile to C++, Objective-C or JavaScript. This document tries to
concentrate in a single place all the backend and interfacing options.

The Nim compiler supports mainly two backend families: the C, C++ and
Objective-C targets and the JavaScript target. `The C like targets`_ creates
source files which can be compiled into a library or a final executable. `The
JavaScript target`_ can generate a ``.js`` file which you reference from an
HTML file or create a `standalone nodejs program <http://nodejs.org>`_.
Objective-C targets and the JavaScript target. `The C like targets
<#backends-the-c-like-targets>`_ creates source files which can be compiled
into a library or a final executable. `The JavaScript target
<#backends-the-javascript-target>`_ can generate a ``.js`` file which you
reference from an HTML file or create a `standalone nodejs program
<http://nodejs.org>`_.

On top of generating libraries or standalone applications, Nim offers
bidirectional interfacing with the backend targets through generic and
@@ -205,9 +207,10 @@ from the previous section):

Compile the Nim code to JavaScript with ``nim js -o:calculator.js
calculator.nim`` and open ``host.html`` in a browser. If the browser supports
javascript, you should see the value ``10``. In JavaScript the `echo proc
<system.html#echo>`_ will modify the HTML DOM and append the string. Use the
`dom module <dom.html>`_ for specific DOM querying and modification procs.
javascript, you should see the value ``10`` in the browser's console. Use the
`dom module <dom.html>`_ for specific DOM querying and modification procs
or take a look at `karax <https://github.com/pragmagic/karax>`_ for how to
develop browser based applications.


Backend code calling Nim
@@ -5,7 +5,6 @@
|
||||
Command:
|
||||
//compile, c compile project with default code generator (C)
|
||||
//doc generate the documentation for inputfile
|
||||
//doc2 generate the documentation for inputfile
|
||||
|
||||
Arguments:
|
||||
arguments are passed to the program being run (if --run option is selected)
|
||||
|
||||
@@ -83,7 +83,8 @@ paramListColon = paramList? (':' optInd typeDesc)?
|
||||
doBlock = 'do' paramListArrow pragmas? colcom stmt
|
||||
procExpr = 'proc' paramListColon pragmas? ('=' COMMENT? stmt)?
|
||||
distinct = 'distinct' optInd typeDesc
|
||||
expr = (ifExpr
|
||||
expr = (blockExpr
|
||||
| ifExpr
|
||||
| whenExpr
|
||||
| caseExpr
|
||||
| tryExpr)
|
||||
@@ -141,6 +142,7 @@ tryExpr = 'try' colcom stmt &(optInd 'except'|'finally')
|
||||
exceptBlock = 'except' colcom stmt
|
||||
forStmt = 'for' (identWithPragma ^+ comma) 'in' expr colcom stmt
|
||||
blockStmt = 'block' symbol? colcom stmt
|
||||
blockExpr = 'block' symbol? colcom stmt
|
||||
staticStmt = 'static' colcom stmt
|
||||
deferStmt = 'defer' colcom stmt
|
||||
asmStmt = 'asm' pragma? (STR_LIT | RSTR_LIT | TRIPLE_STR_LIT)
|
||||
|
||||
@@ -1,10 +1,9 @@
|
||||
addr and as asm atomic
|
||||
addr and as asm
|
||||
bind block break
|
||||
case cast concept const continue converter
|
||||
defer discard distinct div do
|
||||
elif else end enum except export
|
||||
finally for from func
|
||||
generic
|
||||
if import in include interface is isnot iterator
|
||||
let
|
||||
macro method mixin mod
|
||||
|
||||
27 doc/lib.rst
@@ -64,6 +64,10 @@ Core
|
||||
* `cpuinfo <cpuinfo.html>`_
|
||||
This module implements procs to determine the number of CPUs / cores.
|
||||
|
||||
* `lenientops <lenientops.html>`_
|
||||
Provides binary operators for mixed integer/float expressions for convenience.
|
||||
|
||||
|
||||
|
||||
Collections and algorithms
|
||||
--------------------------
|
||||
@@ -88,6 +92,10 @@ Collections and algorithms
|
||||
* `sequtils <sequtils.html>`_
|
||||
This module implements operations for the built-in seq type
|
||||
which were inspired by functional programming languages.
|
||||
* `sharedtables <sharedtables.html>`_
|
||||
Nim shared hash table support. Contains shared tables.
|
||||
* `sharedlist <sharedlist.html>`_
|
||||
Nim shared linked list support. Contains shared singly linked list.
|
||||
|
||||
|
||||
String handling
|
||||
@@ -98,6 +106,10 @@ String handling
|
||||
case of a string, splitting a string into substrings, searching for
|
||||
substrings, replacing substrings.
|
||||
|
||||
* `strformat <strformat.html>`_
|
||||
Macro based standard string interpolation / formatting. Inpired by
|
||||
Python's ```f``-strings.
|
||||
|
||||
* `strmisc <strmisc.html>`_
|
||||
This module contains uncommon string handling operations that do not
|
||||
fit with the commonly used operations in strutils.
|
||||
@@ -182,6 +194,12 @@ Generic Operating System Services
|
||||
This module implements asynchronous file reading and writing using
|
||||
``asyncdispatch``.
|
||||
|
||||
* `distros <distros.html>`_
|
||||
This module implements the basics for OS distribution ("distro") detection and the OS's native package manager.
|
||||
Its primary purpose is to produce output for Nimble packages, but it also contains the widely used **Distribution** enum
|
||||
that is useful for writing platform specific code.
|
||||
|
||||
|
||||
Math libraries
|
||||
--------------
|
||||
|
||||
@@ -369,6 +387,7 @@ Cryptography and Hashing
|
||||
* `securehash <securehash.html>`_
|
||||
This module implements a sha1 encoder and decoder.
|
||||
|
||||
|
||||
Multimedia support
|
||||
------------------
|
||||
|
||||
@@ -409,6 +428,9 @@ Miscellaneous
|
||||
* `unittest <unittest.html>`_
|
||||
Implements a Unit testing DSL.
|
||||
|
||||
* `segfaults <segfaults.html>`_
|
||||
Turns access violations or segfaults into a ``NilAccessError`` exception.
|
||||
|
||||
Modules for JS backend
|
||||
---------------------------
|
||||
|
||||
@@ -418,6 +440,8 @@ Modules for JS backend
|
||||
* `jsffi <jsffi.html>`_
|
||||
Types and macros for easier interaction with JavaScript.
|
||||
|
||||
* `asyncjs <asyncjs.html>`_
|
||||
Types and macros for writing asynchronous procedures in JavaScript.
|
||||
|
||||
Deprecated modules
|
||||
------------------
|
||||
@@ -533,9 +557,6 @@ Database support
|
||||
Network Programming and Internet Protocols
|
||||
------------------------------------------
|
||||
|
||||
* `libuv <libuv.html>`_
|
||||
Wrapper for the libuv library used for async I/O programming.
|
||||
|
||||
* `joyent_http_parser <joyent_http_parser.html>`_
|
||||
Wrapper for the joyent's high-performance HTTP parser.
|
||||
|
||||
|
||||
@@ -206,7 +206,7 @@ strings, because they are precompiled.
**Note**: Passing variables to the ``dynlib`` pragma will fail at runtime
because of order of initialization problems.

**Note**: A ``dynlib`` import can be overriden with
**Note**: A ``dynlib`` import can be overridden with
the ``--dynlibOverride:name`` command line option. The Compiler User Guide
contains further information.

@@ -9,26 +9,26 @@ The following example shows a generic binary tree can be modelled:
|
||||
|
||||
.. code-block:: nim
|
||||
type
|
||||
BinaryTreeObj[T] = object # BinaryTreeObj is a generic type with
|
||||
# with generic param ``T``
|
||||
le, ri: BinaryTree[T] # left and right subtrees; may be nil
|
||||
data: T # the data stored in a node
|
||||
BinaryTree[T] = ref BinaryTreeObj[T] # a shorthand for notational convenience
|
||||
BinaryTree*[T] = ref object # BinaryTree is a generic type with
|
||||
# generic param ``T``
|
||||
le, ri: BinaryTree[T] # left and right subtrees; may be nil
|
||||
data: T # the data stored in a node
|
||||
|
||||
proc newNode[T](data: T): BinaryTree[T] = # constructor for a node
|
||||
proc newNode*[T](data: T): BinaryTree[T] =
|
||||
# constructor for a node
|
||||
new(result)
|
||||
result.data = data
|
||||
|
||||
proc add[T](root: var BinaryTree[T], n: BinaryTree[T]) =
|
||||
proc add*[T](root: var BinaryTree[T], n: BinaryTree[T]) =
|
||||
# insert a node into the tree
|
||||
if root == nil:
|
||||
root = n
|
||||
else:
|
||||
var it = root
|
||||
while it != nil:
|
||||
var c = cmp(it.data, n.data) # compare the data items; uses
|
||||
# the generic ``cmp`` proc that works for
|
||||
# any type that has a ``==`` and ``<``
|
||||
# operator
|
||||
# compare the data items; uses the generic ``cmp`` proc
|
||||
# that works for any type that has a ``==`` and ``<`` operator
|
||||
var c = cmp(it.data, n.data)
|
||||
if c < 0:
|
||||
if it.le == nil:
|
||||
it.le = n
|
||||
@@ -40,20 +40,28 @@ The following example shows a generic binary tree can be modelled:
|
||||
return
|
||||
it = it.ri
|
||||
|
||||
iterator inorder[T](root: BinaryTree[T]): T =
|
||||
# inorder traversal of a binary tree
|
||||
# recursive iterators are not yet implemented, so this does not work in
|
||||
# the current compiler!
|
||||
if root.le != nil: yield inorder(root.le)
|
||||
yield root.data
|
||||
if root.ri != nil: yield inorder(root.ri)
|
||||
proc add*[T](root: var BinaryTree[T], data: T) =
|
||||
# convenience proc:
|
||||
add(root, newNode(data))
|
||||
|
||||
iterator preorder*[T](root: BinaryTree[T]): T =
|
||||
# Preorder traversal of a binary tree.
|
||||
# Since recursive iterators are not yet implemented,
|
||||
# this uses an explicit stack (which is more efficient anyway):
|
||||
var stack: seq[BinaryTree[T]] = @[root]
|
||||
while stack.len > 0:
|
||||
var n = stack.pop()
|
||||
while n != nil:
|
||||
yield n.data
|
||||
add(stack, n.ri) # push right subtree onto the stack
|
||||
n = n.le # and follow the left pointer
|
||||
|
||||
var
|
||||
root: BinaryTree[string] # instantiate a BinaryTree with the type string
|
||||
add(root, newNode("hallo")) # instantiates generic procs ``newNode`` and
|
||||
add(root, newNode("world")) # ``add``
|
||||
for str in inorder(root):
|
||||
writeLine(stdout, str)
|
||||
root: BinaryTree[string] # instantiate a BinaryTree with ``string``
|
||||
add(root, newNode("hello")) # instantiates ``newNode`` and ``add``
|
||||
add(root, "world") # instantiates the second ``add`` proc
|
||||
for str in preorder(root):
|
||||
stdout.writeLine(str)
|
||||
|
||||
|
||||
Is operator
|
||||
|
||||
@@ -2,7 +2,7 @@ Guards and locks
================

Apart from ``spawn`` and ``parallel`` Nim also provides all the common low level
concurrency mechanisms like locks, atomic intristics or condition variables.
concurrency mechanisms like locks, atomic intrinsics or condition variables.

Nim significantly improves on the safety of these features via additional
pragmas:
@@ -74,7 +74,7 @@ model low level lockfree mechanisms:

The ``locks`` pragma takes a list of lock expressions ``locks: [a, b, ...]``
in order to support *multi lock* statements. Why these are essential is
explained in the `lock levels`_ section.
explained in the `lock levels <#guards-and-locks-lock-levels>`_ section.


Protecting general locations
@@ -105,7 +105,7 @@ From import statement
~~~~~~~~~~~~~~~~~~~~~

After the ``from`` statement a module name follows followed by
an ``import`` to list the symbols one likes to use without explict
an ``import`` to list the symbols one likes to use without explicit
full qualification:

.. code-block:: nim
@@ -123,7 +123,7 @@ in ``module``.
Export statement
~~~~~~~~~~~~~~~~

An ``export`` statement can be used for symbol fowarding so that client
An ``export`` statement can be used for symbol forwarding so that client
modules don't need to import a module's dependencies:

.. code-block:: nim
@@ -70,7 +70,7 @@ compileTime pragma
|
||||
The ``compileTime`` pragma is used to mark a proc or variable to be used at
|
||||
compile time only. No code will be generated for it. Compile time procs are
|
||||
useful as helpers for macros. Since version 0.12.0 of the language, a proc
|
||||
that uses ``system.NimNode`` within its parameter types is implictly declared
|
||||
that uses ``system.NimNode`` within its parameter types is implicitly declared
|
||||
``compileTime``:
|
||||
|
||||
.. code-block:: nim
|
||||
@@ -102,6 +102,14 @@ collector to not consider objects of this type as part of a cycle:
|
||||
left, right: Node
|
||||
data: string
|
||||
|
||||
Or if we directly use a ref object:
|
||||
|
||||
.. code-block:: nim
|
||||
type
|
||||
Node = ref object {.acyclic, final.}
|
||||
left, right: Node
|
||||
data: string
|
||||
|
||||
In the example a tree structure is declared with the ``Node`` type. Note that
|
||||
the type definition is recursive and the GC has to assume that objects of
|
||||
this type may form a cyclic graph. The ``acyclic`` pragma passes the
|
||||
@@ -316,7 +324,7 @@ factor.
|
||||
immediate pragma
|
||||
----------------
|
||||
|
||||
See `Ordinary vs immediate templates`_.
|
||||
See `Typed vs untyped parameters`_.
|
||||
|
||||
|
||||
compilation option pragmas
|
||||
@@ -733,7 +741,8 @@ about the ``importcpp`` pragma pattern language. It is not necessary
|
||||
to know all the details described here.
|
||||
|
||||
|
||||
Similar to the `importc pragma for C <manual.html#importc-pragma>`_, the
|
||||
Similar to the `importc pragma for C
|
||||
<#foreign-function-interface-importc-pragma>`_, the
|
||||
``importcpp`` pragma can be used to import `C++`:idx: methods or C++ symbols
|
||||
in general. The generated code then uses the C++ method calling
|
||||
syntax: ``obj->method(arg)``. In combination with the ``header`` and ``emit``
|
||||
@@ -955,10 +964,11 @@ Produces:
|
||||
|
||||
ImportObjC pragma
|
||||
-----------------
|
||||
Similar to the `importc pragma for C <manual.html#importc-pragma>`_, the
|
||||
``importobjc`` pragma can be used to import `Objective C`:idx: methods. The
|
||||
generated code then uses the Objective C method calling syntax: ``[obj method
|
||||
param1: arg]``. In addition with the ``header`` and ``emit`` pragmas this
|
||||
Similar to the `importc pragma for C
|
||||
<#foreign-function-interface-importc-pragma>`_, the ``importobjc`` pragma can
|
||||
be used to import `Objective C`:idx: methods. The generated code then uses the
|
||||
Objective C method calling syntax: ``[obj method param1: arg]``.
|
||||
In addition with the ``header`` and ``emit`` pragmas this
|
||||
allows *sloppy* interfacing with libraries written in Objective C:
|
||||
|
||||
.. code-block:: Nim
|
||||
|
||||
@@ -142,10 +142,11 @@ The method call syntax conflicts with explicit generic instantiations:
parsed as ``(x.p)[T]``.

**Future directions**: ``p[.T.]`` might be introduced as an alternative syntax
to pass explict types to a generic and then ``x.p[.T.]`` can be parsed as
to pass explicit types to a generic and then ``x.p[.T.]`` can be parsed as
``x.(p[.T.])``.

See also: `Limitations of the method call syntax`_.
See also: `Limitations of the method call syntax
<#templates-limitations-of-the-method-call-syntax>`_.


Properties
@@ -178,7 +179,7 @@ different; for this a special setter syntax is needed:
Command invocation syntax
-------------------------

Routines can be invoked without the ``()`` if the call is syntatically
Routines can be invoked without the ``()`` if the call is syntactically
a statement. This command invocation syntax also works for
expressions, but then only a single argument may follow. This restriction
means ``echo f 1, f 2`` is parsed as ``echo(f(1), f(2))`` and not as
@@ -17,8 +17,8 @@ or dynamic file formats such as JSON or XML.
When Nim encounters an expression that cannot be resolved by the
standard overload resolution rules, the current scope will be searched
for a dot operator that can be matched against a re-written form of
the expression, where the unknown field or proc name is converted to
an additional static string parameter:
the expression, where the unknown field or proc name is passed to
an ``untyped`` parameter:

.. code-block:: nim
a.b # becomes `.`(a, "b")
@@ -28,7 +28,7 @@ The matched dot operators can be symbols of any callable kind (procs,
templates and macros), depending on the desired effect:

.. code-block:: nim
proc `.` (js: PJsonNode, field: string): JSON = js[field]
template `.` (js: PJsonNode, field: untyped): JSON = js[astToStr(field)]

var js = parseJson("{ x: 1, y: 2}")
echo js.x # outputs 1
Some files were not shown because too many files have changed in this diff.