Predicted values and intervals based on a fitted model object.
Usage
# S3 method for ssn_lm
predict(
  object,
  newdata,
  se.fit = FALSE,
  interval = c("none", "confidence", "prediction"),
  level = 0.95,
  block = FALSE,
  ...
)
# S3 method for ssn_glm
predict(
  object,
  newdata,
  type = c("link", "response"),
  se.fit = FALSE,
  interval = c("none", "confidence", "prediction"),
  newdata_size,
  level = 0.95,
  var_correct = TRUE,
  ...
)
Arguments
- object
  A fitted model object from ssn_lm() or ssn_glm().
- newdata
  A character vector that indicates the name of the prediction data set in the SSN object for which predictions are desired. If omitted, predictions for all prediction data sets are returned. Note that the name ".missing" indicates the prediction data set that contains the missing observations in the data used to fit the model.
- se.fit
  A logical indicating if standard errors are returned. The default is FALSE.
- interval
  Type of interval calculation. The default is "none". Other options are "confidence" (for confidence intervals) and "prediction" (for prediction intervals).
- level
  Tolerance/confidence level. The default is 0.95.
- block
  A logical indicating whether a block prediction over the entire region in newdata should be returned. The default is FALSE, which returns point predictions for each location in newdata. Currently only available for models fit using ssn_lm() or models fit using ssn_glm() where family is "gaussian".
- ...
  Other arguments. Not used (needed for generic consistency).
- type
  The scale (response or link) of predictions obtained using ssn_glm objects.
- newdata_size
  The size value for each observation in newdata used when predicting for the binomial family.
- var_correct
  A logical indicating whether to return the corrected prediction variances when predicting via models fit using ssn_glm. The default is TRUE.
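For ssn_glm objects, type controls the prediction scale and newdata_size supplies binomial sizes. A minimal sketch, assuming a hypothetical binomial fit named glm_mod, a prediction data set named "preds", and a size vector n_trials (these names are illustrative, not part of the package data):

# predictions on the response (probability) scale rather than the link scale
predict(glm_mod, "preds", type = "response")
# for binomial counts, supply a size value for each prediction location
predict(glm_mod, "preds", type = "response", newdata_size = n_trials)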
Value
If se.fit is FALSE, predict.ssn() returns a vector of predictions, or a matrix of predictions with column names fit, lwr, and upr if interval is "confidence" or "prediction". If se.fit is TRUE, a list with the following components is returned:
- fit: vector or matrix as above
- se.fit: standard error of each fit
Details
The (empirical) best linear unbiased predictions (i.e., Kriging predictions) at each site are returned when interval is "none" or "prediction", alongside standard errors. Prediction intervals are also returned if interval is "prediction". When interval is "confidence", the estimated mean is returned alongside standard errors and confidence intervals for the mean.
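The interval and se.fit arguments combine as sketched below, continuing from the ssn_mod object fitted in the Examples section (argument values are illustrative):

# matrix with columns fit, lwr, and upr (90% prediction intervals)
predict(ssn_mod, "CapeHorn", interval = "prediction", level = 0.90)
# a list with components fit and se.fit
pred <- predict(ssn_mod, "CapeHorn", interval = "prediction", se.fit = TRUE)
pred$se.fit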
Examples
# Copy the mf04p .ssn data to a local directory and read it into R
# When modeling with your .ssn object, you will load it using the relevant
# path to the .ssn data on your machine
copy_lsn_to_temp()
temp_path <- paste0(tempdir(), "/MiddleFork04.ssn")
mf04p <- ssn_import(temp_path, predpts = "CapeHorn", overwrite = TRUE)
ssn_mod <- ssn_lm(
  formula = Summer_mn ~ ELEV_DEM,
  ssn.object = mf04p,
  tailup_type = "exponential",
  additive = "afvArea"
)
predict(ssn_mod, "CapeHorn")
#> 1 2 3 4 5 6 7 8
#> 9.984935 9.985021 9.985107 9.985193 9.985279 9.985365 9.927806 9.927892
#> 9 10 11 12 13 14 15 16
#> 9.927978 9.985708 9.985794 9.985880 9.985966 9.986052 9.986138 10.015046
#> 17 18 19 20 21 22 23 24
#> 10.015132 10.015217 10.015303 10.015389 10.015475 10.044383 10.044469 10.044555
#> 25 26 27 28 29 30 31 32
#> 10.044641 10.044727 10.044813 10.044899 10.044984 10.045070 10.045156 10.045242
#> 33 34 35 36 37 38 39 40
#> 10.016506 10.016592 9.987855 9.987941 9.988027 10.016935 10.017021 10.017107
#> 41 42 43 44 45 46 47 48
#> 10.017193 10.017279 10.017364 10.017450 10.046358 10.046444 10.046530 10.046616
#> 49 50 51 52 53 54 55 56
#> 10.046702 10.046788 10.046874 10.018137 10.018223 10.018309 10.018395 10.018481
#> 57 58 59 60 61 62 63 64
#> 10.047389 10.047475 10.047389 10.047262 10.047135 10.047008 10.046881 10.046755
#> 65 66 67 68 69 70 71 72
#> 10.046628 10.046501 10.046374 10.046247 10.017298 10.017171 10.017044 10.016917
#> 73 74 75 76 77 78 79 80
#> 10.016790 10.016663 10.016537 10.016410 10.016283 10.016156 10.016029 10.015902
#> 81 82 83 84 85 86 87 88
#> 9.986953 9.986826 9.986699 9.986572 9.986445 9.986319 9.986192 9.986065
#> 89 90 91 92 93 94 95 96
#> 9.985938 10.014633 10.014506 10.014379 10.071897 10.071770 10.071643 10.100339
#> 97 98 99 100 101 102 103 104
#> 10.100212 10.100054 10.215128 10.214914 10.214699 10.243307 10.243092 10.242878
#> 105 106 107 108 109 110 111 112
#> 10.271486 10.271271 10.271057 10.270842 10.270627 10.270413 10.270198 10.269984
#> 113 114 115 116 117 118 119 120
#> 10.269769 10.269555 10.240518 10.240303 10.240089 10.297519 10.297304 10.239445
#> 121 122 123 124 125 126 127 128
#> 10.037475 10.037260 10.296446 10.296231 10.296017 10.065224 10.295588 10.295373
#> 129 130 131 132 133 134 135 136
#> 10.295159 10.150833 10.294730 10.294515 10.294301 10.149975 10.293871 10.293657
#> 137 138 139 140 141 142 143 144
#> 10.293442 10.293228 10.321835 10.321621 10.321406 10.321192 10.320977 10.320763
#> 145 146 147 148 149 150 151 152
#> 10.320548 10.320334 10.348941 10.348727 10.348512 10.319475 10.319261 10.347869
#> 153 154 155 156 157 158 159 160
#> 10.347654 10.376262 10.376047 10.375833 10.375618 10.375403 10.346367 10.374974
#> 161 162 163 164 165 166 167 168
#> 10.374760 10.374545 10.374331 10.374116 10.373902 10.402509 10.402295 10.402080
#> 169 170 171 172 173 174 175 176
#> 10.373043 10.372829 10.372614 10.401222 10.401007 10.400793 10.429400 10.429186
#> 177 178 179 180 181 182 183 184
#> 10.428971 10.428757 10.428542 10.428328 10.428113 10.427899 10.427684 10.427469
#> 185 186 187 188 189 190 191 192
#> 10.456077 10.455863 10.455648 10.455433 10.455219 10.455004 10.454790 10.454575
#> 193 194 195 196 197 198 199 200
#> 10.454361 10.454146 10.453931 10.453717 10.453502 10.453288 10.481895 10.481681
#> 201 202 203 204 205 206 207 208
#> 10.481532 10.481469 10.481406 10.481342 10.481279 10.481216 10.481153 10.481089
#> 209 210 211 212 213 214 215 216
#> 10.481026 10.480963 10.509722 10.509658 10.509595 10.509532 10.509469 10.509405
#> 217 218 219 220 221 222 223 224
#> 10.509342 10.538101 10.509215 10.509152 10.509089 10.509026 10.508962 10.508899
#> 225 226 227 228 229 230 231 232
#> 10.508836 10.508772 10.508709 10.508646 10.537405 10.537342 10.537278 10.537215
#> 233 234 235 236 237 238 239 240
#> 10.537152 10.537088 10.537025 10.565784 10.565721 10.565658 10.565594 10.565531
#> 241 242 243 244 245 246 247 248
#> 10.565468 10.565404 10.565341 10.565278 10.565214 10.565151 10.565088 10.565025
#> 249 250 251 252 253 254 255 256
#> 10.564961 10.564898 10.564835 10.564771 10.564708 10.564645 10.593404 10.593341
#> 257 258 259 260 261 262 263 264
#> 10.593277 10.593214 10.621973 10.621910 10.621846 10.621783 10.621720 10.621656
#> 265 266 267 268 269 270 271 272
#> 10.621593 10.621530 10.621467 10.621403 10.621340 10.621277 10.621213 10.649972
#> 273 274 275 276 277 278 279 280
#> 10.649909 10.649846 10.678605 10.678541 10.678478 10.678415 10.678352 10.678288
#> 281 282 283 284 285 286 287 288
#> 10.678225 10.678162 10.678098 10.678035 10.677972 10.677908 10.677845 10.677782
#> 289 290 291 292 293 294 295 296
#> 10.706541 10.706478 10.706414 10.706351 10.706288 10.706224 10.706161 10.734920
#> 297 298 299 300 301 302 303 304
#> 10.734870 10.734850 10.734830 10.734811 10.734791 10.734771 10.734752 10.734732
#> 305 306 307 308 309 310 311 312
#> 10.734712 10.734693 10.734673 10.734653 9.902581 9.902667 9.873931 9.902839
#> 313 314 315 316 317 318 319 320
#> 9.902925 9.903011 9.903097 9.903183 9.932091 9.932177 9.932263 9.932349
#> 321 322 323 324 325 326 327 328
#> 9.932435 9.847419 9.875979 9.875717 9.875455 9.846370 9.903752 9.903490
#> 329 330 331 332 333 334 335 336
#> 9.903228 9.902966 9.902703 9.873619 9.902179 9.901916 9.844010 9.843747
#> 337 338 339 340 341 342 343 344
#> 9.843485 9.900867 9.900605 9.929165 9.928903 9.928699 9.899963 9.900049
#> 345 346 347 348 349 350 351 352
#> 9.900135 9.784932 9.986774 9.986860 9.986946 9.900565 10.015940 10.016026
#> 353 354 355 356 357 358 359 360
#> 10.016112 10.016198 9.987461 9.987547 9.987633 9.987719 9.987805 9.987891
#> 361 362 363 364 365 366 367 368
#> 9.709357 9.709407 9.709458 9.709508 9.709558 9.709609 9.738481 9.738532
#> 369 370 371 372 373 374 375 376
#> 9.738582 9.767455 9.767505 9.767556 9.767606 9.796479 9.741375 9.741259
#> 377 378 379 380 381 382 383 384
#> 9.741143 9.741026 9.740910 9.740793 9.740677 9.740560 9.740444 9.740327
#> 385 386 387 388 389 390 391 392
#> 9.740211 9.740094 9.711156 9.711039 9.710923 9.710806 9.710690 9.710573
#> 393 394 395 396 397 398 399 400
#> 9.710457 9.710340 9.710079 9.709817 9.680732 9.680470 9.680208 9.679946
#> 401 402 403 404 405 406 407 408
#> 9.679684 9.708244 9.707981 9.707719 9.707457 9.707195 9.706933 9.706671
#> 409 410 411 412 413 414 415 416
#> 9.735231 9.706146 9.705884 9.705622 9.705360 9.705097 9.704835 9.704573
#> 417 418 419 420 421 422 423 424
#> 9.704311 9.704049 9.703786 9.674702 9.674440 9.674177 9.673915 9.673653
#> 425 426 427 428 429 430 431 432
#> 9.673391 9.673129 9.672866 9.672604 9.672342 9.672080 9.671818 9.700378
#> 433 434 435 436 437 438 439 440
#> 9.700116 9.699853 9.670769 9.670507 9.670244 9.669982 9.698542 9.698280
#> 441 442 443 444 445 446 447 448
#> 9.669196 9.668933 9.697493 9.697231 9.696969 9.696707 9.696445 9.696182
#> 449 450 451 452 453 454 455 456
#> 9.695920 9.695658 9.695396 9.608667 9.608405 9.608142 9.665525 9.665263
#> 457 458 459 460 461 462 463 464
#> 9.665000 9.635916 9.635654 9.642482 9.642533 9.503475 9.503517 9.503558
#> 465 466 467 468 469 470 471 472
#> 9.503600 9.503641 9.503683 9.635391 9.635129 9.663689 9.503724 9.532588
#> 473 474 475 476 477 478 479 480
#> 9.532629 9.532670 9.532712 9.532753 9.532795 9.504014 9.561700 9.561741
#> 481 482 483 484 485 486 487 488
#> 9.561783 9.533002 9.561865 9.561907 9.561948 9.533167 9.533209 9.533259
#> 489 490 491 492 493 494 495 496
#> 9.562133 9.562183 9.562234 9.562285 9.562336 9.562387 9.562438 9.562488
#> 497 498 499 500 501 502 503 504
#> 9.562539 9.562590 9.562641 9.562692 9.562743 9.562793 9.562844 9.562895
#> 505 506 507 508 509 510 511 512
#> 9.562946 9.562997 9.563047 9.563098 10.763456 10.763436 10.763417 10.763397
#> 513 514 515 516 517 518 519 520
#> 10.763377 10.763358 10.763338 10.763318 10.763299 10.792101 10.016838 9.988102
#> 521 522 523 524 525 526 527 528
#> 9.988188 9.988274 9.988360 9.959624 9.959710 9.959796 9.931059 9.931145
#> 529 530 531 532 533 534 535 536
#> 9.931231 9.902495 9.796529 9.796580 9.796630 9.796680 9.796731 9.796781
#> 537 538 539 540 541 542 543 544
#> 9.796832 9.796882 9.796932 9.768160 9.768211 9.768261 9.768312 9.768362
#> 545 546 547 548 549 550 551 552
#> 9.768412 9.768463 9.739691 9.739741 9.768614 9.739842 9.739893 9.739943
#> 553 554 555 556 557 558 559 560
#> 9.739993 9.711221 9.711272 9.740145 9.740195 9.711423 9.711473 9.711524
#> 561 562 563 564 565 566 567 568
#> 9.682752 9.711625 9.711675 9.711725 9.711776 9.711826 9.711877 9.711927
#> 569 570 571 572 573 574 575 576
#> 9.711977 9.740850 9.740900 9.740951 9.741001 9.741052 9.741102 9.769975
#> 577 578 579 580 581 582 583 584
#> 9.770025 9.770075 9.770126 9.741354 9.741404 9.741455 9.741505 9.712733
#> 585 586 587 588 589 590 591 592
#> 9.712784 9.712834 9.712884 9.684112 9.712985 9.713035 9.713086 9.713136
#> 593 594 595 596 597 598 599 600
#> 9.741841 9.741725 9.741608 9.741492 9.605783 9.605520 9.605258 9.662640
#> 601 602 603 604 605 606 607 608
#> 9.662378 9.662116 9.719498 9.719236 9.718974 9.747534 9.747272 9.747010
#> 609 610 611 612 613 614 615 616
#> 9.804392 9.804130 9.861512 9.861250 9.860987 9.918370 9.918107 9.582860
#> 617 618 619 620 621 622 623 624
#> 9.582910 9.582961 9.583012 9.611885 9.611936 9.611986 9.612037 9.612088
#> 625 626 627 628 629 630 631 632
#> 9.612138 9.612189 9.612240 9.612291 9.612327 9.612392 9.612443 9.612494
#> 633 634 635 636 637 638 639 640
#> 9.612544 9.612595 9.641468 9.641519 9.641569 9.612798 9.612849 9.641722
#> 641 642 643 644 645 646 647 648
#> 9.641772 9.641823 9.641874 9.641924 9.641975 9.642026 9.670899 9.670950
#> 649 650 651 652 653 654
#> 9.671000 9.671051 9.671102 9.642330 9.642381 9.642432