
Downloads:

1,252

Downloads of v 1.2.0:

1,252

Last Update:

16 May 2016

Package Maintainer(s):

Software Author(s):

  • Amazon Web Services
  • DarwinJS

Tags:

Amazon AWS CloudWatch Monitoring Scripts PowerShell Windows

AWS Disk Monitoring Script For Windows (Install)


1.2.0 | Updated: 16 May 2016


All Checks are Passing

3 Passing Tests

Validation Testing Passed

Verification Testing Passed

Scan Testing Successful: No detections found in any package files

Deployment Method: Individual Install, Upgrade, & Uninstall

To install AWS Disk Monitoring Script For Windows (Install), run the following command from the command line or from PowerShell:

> choco install aws-monitor-diskusage

To upgrade AWS Disk Monitoring Script For Windows (Install), run the following command from the command line or from PowerShell:

> choco upgrade aws-monitor-diskusage

To uninstall AWS Disk Monitoring Script For Windows (Install), run the following command from the command line or from PowerShell:

> choco uninstall aws-monitor-diskusage

Deployment Method: Organizational Deployment (Internal Repositories)

NOTE

This applies to both open source and commercial editions of Chocolatey.

1. Enter Your Internal Repository Url

(this should look similar to https://community.chocolatey.org/api/v2/)


2. Setup Your Environment

1. Ensure you are set for organizational deployment

Please see the organizational deployment guide

2. Get the package into your environment

  • Open Source or Commercial:
    • Proxy Repository - Create a proxy nuget repository on Nexus, Artifactory Pro, or a proxy Chocolatey repository on ProGet. Point your upstream to https://community.chocolatey.org/api/v2/. Packages cache on first access automatically. Make sure your choco clients are using your proxy repository as a source and NOT the default community repository. See source command for more information.
    • You can also just download the package and push it to a repository.

3. Copy Your Script

choco upgrade aws-monitor-diskusage -y --source="'INTERNAL REPO URL'" [other options]

See options you can pass to upgrade.

See best practices for scripting.

Add this to a PowerShell script or use a Batch script with tools and in places where you are calling directly to Chocolatey. If you are integrating, keep in mind enhanced exit codes.

If you do use a PowerShell script, use the following to ensure bad exit codes are shown as failures:


choco upgrade aws-monitor-diskusage -y --source="'INTERNAL REPO URL'" 
$exitCode = $LASTEXITCODE

Write-Verbose "Exit code was $exitCode"
$validExitCodes = @(0, 1605, 1614, 1641, 3010)
if ($validExitCodes -contains $exitCode) {
  Exit 0
}

Exit $exitCode
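For integrations outside PowerShell, the same exit-code filtering can be sketched in Python. The valid-code set (0, 1605, 1614, 1641, 3010) comes from the snippet above; the function name is illustrative, not part of any Chocolatey API.

```python
# Acceptable Chocolatey/MSI exit codes, taken from the PowerShell snippet
# above (e.g. 3010 = success, reboot required).
VALID_EXIT_CODES = {0, 1605, 1614, 1641, 3010}

def normalized_exit_code(exit_code: int) -> int:
    """Return 0 for any acceptable exit code, else pass it through
    so the caller reports a failure."""
    return 0 if exit_code in VALID_EXIT_CODES else exit_code
```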

Ansible

- name: Install aws-monitor-diskusage
  win_chocolatey:
    name: aws-monitor-diskusage
    version: '1.2.0'
    source: INTERNAL REPO URL
    state: present

See docs at https://docs.ansible.com/ansible/latest/modules/win_chocolatey_module.html.


Chef

chocolatey_package 'aws-monitor-diskusage' do
  action  :install
  source  'INTERNAL REPO URL'
  version '1.2.0'
end

See docs at https://docs.chef.io/resource_chocolatey_package.html.


PowerShell DSC

cChocoPackageInstaller aws-monitor-diskusage
{
    Name     = "aws-monitor-diskusage"
    Version  = "1.2.0"
    Source   = "INTERNAL REPO URL"
}

Requires cChoco DSC Resource. See docs at https://github.com/chocolatey/cChoco.


Puppet

package { 'aws-monitor-diskusage':
  ensure   => '1.2.0',
  provider => 'chocolatey',
  source   => 'INTERNAL REPO URL',
}

Requires Puppet Chocolatey Provider module. See docs at https://forge.puppet.com/puppetlabs/chocolatey.


4. If applicable - Chocolatey configuration/installation

See infrastructure management matrix for Chocolatey configuration elements and examples.

Package Approved

This package was approved by moderator ferventcoder on 21 May 2016.

Description

ATTENTION: This is a license-compliant, modified, derivative work; please see "UPDATE LOG" in the script comments for the changes made.
Old locations that mentioned the original script:
http://aws.amazon.com/code/7932034889155460 (original URL now broken)
https://blog.appliedis.com/2012/10/05/how-to-publish-custom-performance-metrics-to-amazons-cloudwatch/
https://aws.amazon.com/blogs/aws/amazon-cloudwatch-monitoring-scripts-for-microsoft-windows/

2016-05-16 DarwinJS. Version 1.2.0
- Corrected script help - was an unedited paste of mon-put-metrics-mem
- Added help examples for new functionality.
- Fixed to find .NET 3.5 with newer installs of AWS SDK on Amazon AMIs.
- allows -disk_drive 'all' to simply upload stats on all local disks - whatever they are for that instance.
Will also dynamically adjust if disks are added to or removed from the instance in the future.
- drops any non-existent disks from the list given in -disk_drive, rather than generating an error.
- removed assumption of credentials being provided so that code can rely on the much better practice of using instance roles.
- replaced all "write-host" lines with better practice "write-output".
- updated parameters and defaults so that if the script is used with no parameters it reports disk utilization for
all installed disks and relies on instance roles for permission to post to cloudwatch.
- switch -selfschedulewiththeseparams schedules the script in the task scheduler instead of running. Uses all parameters
given on the script call in the scheduled task (except, of course the parameter "-selfschedulewiththeseparams" itself)
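The drive-list behavior described above ('all' expands to every local disk, and non-existent disks are dropped rather than raising an error) can be sketched as follows. This is a minimal illustration, not the package's code: `local_drives` stands in for the WMI query the script actually performs, and the function name is hypothetical.

```python
def resolve_drive_list(requested, local_drives):
    """Mirror the script's check-disks logic: the special value 'all'
    selects every local drive; otherwise keep only the requested drives
    that actually exist, silently dropping the rest."""
    if any(d.lower() == "all" for d in requested):
        return list(local_drives)
    # Preserve local-drive order, as the script filters against the
    # machine's drive list rather than the user's argument order.
    return [d for d in local_drives if d in requested]
```

For example, requesting C: and a non-existent E: on a machine with C: and D: yields only C:, with no error raised.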


tools\aws-monitor-diskusage\awscreds.conf
AWSSecretKey=
AWSAccessKeyId=
tools\aws-monitor-diskusage\LICENSE.txt
Apache License
Version 2.0, January 2004

TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION

1. Definitions.

"License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document.

"Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License.

"Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity.

"You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License.

"Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files.

"Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types.

"Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below).

"Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof.

"Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution."

"Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work.

2. Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form.

3. Grant of Patent License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed.

4. Redistribution. You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions:

    You must give any other recipients of the Work or Derivative Works a copy of this License; and

    You must cause any modified files to carry prominent notices stating that You changed the files; and

    You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and

    If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License.

You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License.

5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions.

6. Trademarks. This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file.

7. Disclaimer of Warranty. Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License.

8. Limitation of Liability. In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages.

9. Accepting Warranty or Additional Liability. While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability.

END OF TERMS AND CONDITIONS
tools\aws-monitor-diskusage\mon-put-metrics-disk-darwinjs.ps1
<#

  Copyright 2012-2012 Amazon.com, Inc. or its affiliates. All Rights Reserved.

    Licensed under the Apache License, Version 2.0 (the "License"). You may not use this file except in compliance with the License. A copy of the License is located at

          http://aws.amazon.com/apache2.0/

    or in the "license" file accompanying this file. This file is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.

ATTENTION: This is a license-compliant, modified, derivative work; please see "UPDATE LOG" later in these comments for the changes made.

.SYNOPSIS
Collects disk utilization (%), available disk space and used disk space on an Amazon Windows EC2 instance and sends this data as custom metrics to Amazon CloudWatch.

.DESCRIPTION
This script is used to send custom metrics to Amazon Cloudwatch. This script pushes disk utilization (%), available disk space and used disk space to cloudwatch.
This script can be scheduled or run from a powershell prompt.
When launched from scheduler you can optionally specify logfile and all messages will be logged to logfile. You can use whatif and verbose mode with this script.

.PARAMETER disk_util
		Reports disk utilization in percentages.
.PARAMETER disk_used
		Reports disk space used.
.PARAMETER disk_avail
		Reports available disk space.
.PARAMETER disk_space_units
		Specifies units for disk metrics.
.PARAMETER disk_drive
		List of disk specifiers for which to report space.  Special value 'all' reports for all local disks.
.PARAMETER from_scheduler
		Specifies that this script is running from Task Scheduler.
.PARAMETER aws_access_id
		Specifies the AWS access key ID to use to identify the caller.
.PARAMETER aws_secret_key
		Specifies the AWS secret key to use to sign the request.
.PARAMETER aws_credential_file
		Specifies the location of the file with AWS credentials. Uses "AWS_CREDENTIAL_FILE" Env variable as default.
.PARAMETER logfile
		Logs all error messages to a log file. This is required when from_scheduler is set.
.PARAMETER selfschedulewiththeseparams
		Schedules this script to run in the Windows Task Scheduler.
.PARAMETER unschedule
  	Unschedules a previously scheduled job.

.EXAMPLE
    powershell.exe .\mon-put-metrics-disk-darwinjs.ps1  -aws_access_id ThisIsMyAccessKey -aws_secret_key ThisIsMySecretKey -disk_space_util -disk_space_avail -disk_space_units kilobytes

    Encoded credentials are a bad idea.  This script works with Instance Roles as well (V1.2.0).

.EXAMPLE
	powershell.exe .\mon-put-metrics-disk-darwinjs.ps1  -aws_credential_file C:\awscreds.conf -disk_drive C:, d -disk_space_util -disk_space_used -disk_space_avail -disk_space_units gigabytes

    Encoded credentials are a bad idea.  This script works with Instance Roles as well (V1.2.0).

.EXAMPLE
	powershell.exe .\mon-put-metrics-disk-darwinjs.ps1  -aws_credential_file C:\awscreds.conf -disk_drive C:,D: -disk_space_util -disk_space_units gigabytes  -from_scheduler -logfile C:\mylogfile.log

    Encoded credentials are a bad idea.  This script works with Instance Roles as well (V1.2.0).

.EXAMPLE
	powershell.exe .\mon-put-metrics-disk-darwinjs.ps1

     V1.2.0: Posts *usage* metrics, in gigabytes, for all local drives - whatever they are.  Will automatically handle differences in attached disks.

.EXAMPLE
	powershell.exe .\mon-put-metrics-disk-darwinjs.ps1  -disk_drive all -disk_space_util -disk_space_units gigabytes -from_scheduler -logfile C:\mylogfile.log

    V1.2.0: Posts metrics for all local drives - whatever they are.  Will automatically handle differences in attached disks.

.EXAMPLE
	powershell.exe .\mon-put-metrics-disk-darwinjs.ps1 -selfschedulewiththeseparams -disk_drive all -disk_space_util -disk_space_units gigabytes -from_scheduler -logfile C:\mylogfile.log

    V1.2.0: Schedules a job in the Task Scheduler to post disk utilization for all local drives every 5 minutes.  Does not run the metrics when scheduling the job (but the scheduled job does post them).
.EXAMPLE
  powershell.exe .\mon-put-metrics-disk-darwinjs.ps1 -unschedule

    V1.2.0: Unschedules a previously scheduled job.

.NOTES
    PREREQUISITES (should be available ahead of time for any Amazon provided AMIs):
    1) Download the SDK library from http://aws.amazon.com/sdkfornet/
    2) Obtain Secret and Access keys from https://aws-portal.amazon.com/gp/aws/developer/account/index.html?action=access-key

	API Reference:http://docs.amazonwebservices.com/AWSEC2/latest/APIReference/query-apis.html

UPDATE LOG:

2016-05-16 DarwinJS.  Version 1.2.0
   - Corrected script help - was an unedited paste of mon-put-metrics-mem
   - Added help examples for new functionality.
   - Fixed to find .NET 3.5 with newer installs of AWS SDK on Amazon AMIs.
   - allows -disk_drive 'all' to simply upload stats on all local disks - whatever they are for that instance.
     Will also dynamically adjust if disks are added to or removed from the instance in the future.
   - drops any non-existent disks from the list given in -disk_drive, rather than generating an error.
   - removed assumption of credentials being provided so that code can rely on the much better practice of using instance roles.
   - replaced all "write-host" lines with better practice "write-output".
   - updated parameters and defaults so that if the script is used with no parameters it reports disk utilization for
     all installed disks and relies on instance roles for permission to post to cloudwatch.
   - switch -selfschedulewiththeseparams schedules the script in the task scheduler instead of running.  Uses all parameters
     given on the script call in the scheduled task (except, of course the parameter "-selfschedulewiththeseparams" itself)

#>

[CmdletBinding(DefaultParametersetName="credsfromfile", supportsshouldprocess = $true) ]
param(
[switch]$selfschedulewiththeseparams,
[switch]$unschedule,
[switch]$disk_space_util=$True, #Set -disk_space_util:$False to disable from command line
[switch]$disk_space_used ,
[switch]$disk_space_avail ,
[string[]]$disk_drive='all',
[ValidateSet("bytes","kilobytes","megabytes","gigabytes" )]
[string]$disk_space_units = "gigabytes",
[switch]$from_scheduler,
[Parameter(Parametersetname ="credsinline")]
[string]$aws_access_id = "",
[Parameter(Parametersetname ="credsfromfile")]
[string]$aws_credential_file = [Environment]::GetEnvironmentVariable("AWS_CREDENTIAL_FILE"),
[Parameter(Parametersetname ="credsinline")]
[string]$aws_secret_key = "",
[string]$logfile,
[Switch]$version
)

$ErrorActionPreference = 'Stop'

### Initialize common variables ###
$accountinfo = New-Object psobject
$wc = New-Object Net.WebClient
$time = Get-Date
[string]$aaid =""
[string]$ask =""
$invoc = (Get-Variable myinvocation -Scope 0).value
$currdirectory = Split-Path $invoc.mycommand.path
$scriptname = $invoc.mycommand.Name
$ver = '1.2.0'
$client_name = 'CloudWatch-PutInstanceDataWindows'
$useragent = "$client_name/$ver"

$TaskName = [io.path]::GetFileNameWithoutExtension($MyInvocation.MyCommand.Path)
$ScriptPath = $MyInvocation.MyCommand.Path

If ($unschedule)
{
  If (Test-Path "$env:windir\System32\Tasks\$TaskName")
  {
    schtasks /delete /tn "$TaskName" /F
  }
  Else
  {
    Write-Output "Did not find a task called `"$Taskname`", nothing to do..."
  }
  Return
}

If ($selfschedulewiththeseparams)
{
  Write-output "Scheduling myself to run in the Task Scheduler (not running the actual script at this time)"
  #collect all parameters except selfschedulewiththeseparams
  #$argList += $MyInvocation.BoundParameters.GetEnumerator() | where {$_.key -ine 'selfschedulewiththeseparams'} | foreach {"-$($_.Key)$((. { switch ($($_.Value)) {"true" { ':$True' } "false" { ':$False' } default { " $($_.Value)" } }}))"}
  $argList += $MyInvocation.BoundParameters.GetEnumerator() | where {$_.key -ine 'selfschedulewiththeseparams'} | foreach {$curarg = $_ ;"$(. { switch ($($curarg.Value)) {'true' { "-$($curarg.Key)" } 'false' { '' } default { "-$($curarg.Key) $($curarg.Value)" } }})"}

  $Frequency = 'MINUTE'
  $minutes = 5

  If (Test-Path "$env:windir\System32\Tasks\$TaskName")
  {
    schtasks /delete /tn "$TaskName" /F
  }

  $argumentstring = "/create /sc `"$Frequency`" /mo $minutes /tn `"$TaskName`" /ru `"NT AUTHORITY\SYSTEM`" /tr `"powershell.exe -executionpolicy bypass -noprofile -file $ScriptPath $argList`" /F"

  Write-output "Scheduling with command: "
  Write-Output "$argumentstring"

  Start-process "schtasks.exe" -ArgumentList "$argumentstring" -nonewwindow -wait

  Write-Output "Schedule was created with command line: $argumentstring"

  Return

}

### Logs all messages to file or prints to console based on from_scheduler setting. ###
function report_message ([string]$message)
{
	if($from_scheduler)
	{	if ($logfile.Length -eq 0)
		{
			$logfile = $currdirectory +"\" +$scriptname.replace('.ps1','.log')
		}
		$message | Out-File -Append -FilePath $logfile
	}
	else
	{
		write-output $message
	}
}

### Global trap for all exceptions for this script. All exceptions will exit the script.###
trap [Exception] {
report_message ($_.Exception.Message)
Exit
}
if ($version)
{
 report_message "$scriptname version $ver"
 exit
}

####Test and load AWS sdk ###
$ProgFilesLoc = (${env:ProgramFiles(x86)}, ${env:ProgramFiles} -ne $null)[0]
$SDKLoc = "$ProgFilesLoc\AWS SDK for .NET\past-releases\version-2\Net35"

if ((Test-Path -PathType Container -Path $SDKLoc) -eq $false) {
    $SDKLoc = "C:\Windows\Assembly"
}

$SDKLibraryLocation = dir $SDKLoc -Recurse -Filter "AWSSDK.dll"
if ($SDKLibraryLocation -eq $null)
{
	throw "Please install the AWS SDK for .NET for this script to work."
}
else
{
	$SDKLibraryLocation = $SDKLibraryLocation.FullName
	Add-Type -Path $SDKLibraryLocation
	Write-Verbose "Assembly Loaded"
}

### Process parameterset for credentials and adds them to a powershell object ###
switch ($PSCmdlet.Parametersetname)
{
	"credsinline" {
					Write-Verbose "Using credentials passed as arguments"

					if (!($aws_access_id.Length -eq 0 ))
						{
							$aaid = $aws_access_id
						}
					else
						{
							throw ("Value of AWS access key id is not specified.")
						}

						if (!($aws_secret_key.Length -eq 0 ))
							{
								$ask = $aws_secret_key
                                $usingEncodedCredentials_VERYBAD = $true
							}
						else
							{
								throw "Value of AWS secret key is not specified."
							}
					}
	"credsfromfile"{
					if ( (test-path variable:aws_credential_file) -AND ($aws_credential_file) -AND (Test-Path $aws_credential_file))
						{
							Write-Verbose "Using AWS credentials file $aws_credential_file"
							Get-Content $aws_credential_file | ForEach-Object {
															if($_ -match '.*=.*'){$text = $_.split("=");
															switch ($text[0].trim())
															{
																"AWSAccessKeyId" 	{$aaid= $text[1].trim()}
																"AWSSecretKey" 		{ $ask = $text[1].trim()}
															}}}
                        $usingEncodedCredentials_VERYBAD = $true
						}
						else {write-output "Not configured to use aws_credential_file"}
					}
     default {
              #no credentials provided - must be using an instance role to get access
              }
}
if (($aaid.length -gt 0) -or ($ask.length -gt 0))
{

	Add-Member -membertype noteproperty -inputobject $accountinfo -name "AWSSecretKey" -value $ask
	Add-Member -membertype noteproperty -inputobject $accountinfo -name "AWSAccessKeyId" -value $aaid
	Remove-Variable ask; Remove-Variable aaid
}

### Check if at least one metric is requested to report.###
if ( !$disk_space_avail -and !$disk_space_used -and !$disk_space_util )
{
	throw "Please specify a metric to report; exiting script."
}

### Avoid a storm of calls at the beginning of a minute.###
if ($from_scheduler)
{
	$rand = new-object system.random
	start-sleep -Seconds $rand.Next(20)
}

### Functions that interact with metadata to get data required for dimension calculation and the endpoint for the CloudWatch API. ###
function get-metadata {
	$extendurl = $args
	$baseurl = "http://169.254.169.254/latest/meta-data"
	$fullurl = $baseurl + $extendurl
	return ($wc.DownloadString($fullurl))
}
function get-region {
	$az = get-metadata ("/placement/availability-zone")
	return ($az.Substring(0, ($az.Length -1)))
}
function get-endpoint {
	$region = get-region
	return "https://monitoring." + $region + ".amazonaws.com/"
}

### Function that creates metric data which will be added to metric list that will be finally pushed to cloudwatch. ###
function append_metric   {

	$metricdata = New-Object Amazon.Cloudwatch.Model.MetricDatum
	$dimensions = New-Object Collections.Generic.List[Amazon.Cloudwatch.Model.Dimension]
	$metricdata.metricname, $metricdata.Unit, $metricdata.value, $dimensions  = $args
	$metricdata.Dimensions = $dimensions
	$metricdata.Timestamp = $time.ToUniversalTime()
	return $metricdata
}

### Function that validates units passed. Default value of Gigabytes is used ###
function parse-units {
	param ([string]$disk_units,
		[long]$disk_unit_div)
	$units = New-Object psobject
	switch ($disk_space_units.ToLower())
					{
						"bytes" 	{ 	$disk_units = "Bytes" ; $disk_unit_div = 1}
						"kilobytes" { 	$disk_units = "Kilobytes" ;$disk_unit_div = 1kb}
						"megabytes" { 	$disk_units = "Megabytes" ;$disk_unit_div = 1mb}
						"gigabytes" {	$disk_units = "Gigabytes" ;$disk_unit_div = 1gb}
						default 	{ 	$disk_units = "Gigabytes" ; $disk_unit_div = 1gb }
					}
	Add-Member -MemberType NoteProperty -InputObject $units -Name "disk_units" -Value $disk_units
	Add-Member -MemberType NoteProperty -InputObject $units -Name "disk_unit_div" -Value $disk_unit_div
	return $units
}

### Verifies the array of drive letters passed ###
function check-disks {
	$drive_list_parsed = @() #New-Object System.Collections.ArrayList

    $LocalDrivesOnThisMachine = @(Get-WMIObject Win32_LogicalDisk -filter "DriveType=3" | select -expand deviceid)

    If ($disk_drive -icontains 'all')
    {
      $drive_list_parsed = $LocalDrivesOnThisMachine
      Write-Output "'all' option found in the drives list, including all local drives: $LocalDrivesOnThisMachine, ('all' overrides any specific drives you may have also specified)"
    }
    Else
    {

      # Split the requested drives into those that exist locally and those that do not.
      # (-contains / -notcontains are case-insensitive by default.)
      $VALIDValuesPresent = @($disk_drive | Where-Object { $LocalDrivesOnThisMachine -contains $_ })
      $INVALIDValuesPresent = @($disk_drive | Where-Object { $LocalDrivesOnThisMachine -notcontains $_ })

      $drive_list_parsed = $VALIDValuesPresent

      If ($INVALIDValuesPresent.count -gt 0)
      {
        Write-Output "Oops, these drives: $INVALIDValuesPresent are not present; only including drives that exist locally: $VALIDValuesPresent"
      }


	}
	return $drive_list_parsed
}
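### Illustrative sketch (not part of the original script): on a machine whose
### local fixed drives are C: and D:, check-disks narrows the requested list to
### drives that actually exist:
#   $disk_drive = @('C:','Z:')
#   check-disks            # returns @('C:') and reports that Z: is not present
#   $disk_drive = @('all')
#   check-disks            # returns every local fixed drive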

### Gets disk statistics using WMI. ###
function get-diskmetrics {

	begin{}
	process {
			$drive_list_parsed = check-disks
			$disksinfo = Get-WMIObject Win32_LogicalDisk -filter "DriveType=3"

			foreach ($diskinfo in $disksinfo){
				foreach ($drivelist in $drive_list_parsed){
					if ($diskinfo.DeviceID -eq $drivelist){
						$diskobj = New-Object psobject
						Add-Member -InputObject $diskobj -MemberType NoteProperty -Name "deviceid" -Value $diskinfo.DeviceID
						Add-Member -InputObject $diskobj -MemberType NoteProperty -Name "Freespace" -Value $diskinfo.Freespace
						Add-Member -InputObject $diskobj -MemberType NoteProperty -Name "size" -Value $diskinfo.size
						Add-Member -InputObject $diskobj -MemberType NoteProperty -Name "UsedSpace" -Value ($diskinfo.size - $diskinfo.Freespace)
						Write-Output $diskobj
					}
				}
			}

}
	end{}
}

### Builds metric objects to be piped to the next function, which pushes them to CloudWatch. ###
function create-diskmetriclist {
		param ([parameter(valuefrompipeline =$true)] $diskobj)
		begin{
				$units = parse-units
				$dims = New-Object Amazon.Cloudwatch.Model.Dimension
				$dims.Name = "InstanceId"
				$dims.value = get-metadata("/instance-id")
			}
		process{
			$dimlist = New-Object Collections.Generic.List[Amazon.Cloudwatch.Model.Dimension]
			$dimlist.Add($dims)
			$dim_drive_letter = New-Object Amazon.Cloudwatch.Model.Dimension
			$dim_drive_letter.Name = "Drive-Letter"
			$dim_drive_letter.value = $diskobj.Deviceid
			$dimlist.Add($dim_drive_letter)
			if ($disk_space_util)
						{
							$percent_disk_util = 0
							if ( [long]$diskobj.size -gt 0 ) { $percent_disk_util = 100 * ([long]$diskobj.UsedSpace/[long]$diskobj.size)}
							# F2 (not N2) so large values are not emitted with group separators,
							# which would not parse as numbers
							write (append_metric "VolumeUtilization" "Percent" ("{0:F2}" -f $percent_disk_util) $dimlist)
						}
			if ($disk_space_used)
						{
							write (append_metric "VolumeUsed" $units.disk_units ("{0:F2}" -f ([long]($diskobj.UsedSpace/$units.disk_unit_div))) $dimlist)
						}
			if ($disk_space_avail)
						{
							write (append_metric "VolumeAvailable" $units.disk_units ("{0:F2}" -f ([long]($diskobj.Freespace/$units.disk_unit_div))) $dimlist)
						}
			}
		end{}
}

### Uses the AWS SDK to push metrics to CloudWatch and reports the RequestId of the response. ###
function put-instancemem {
 param ([parameter(Valuefrompipeline=$true)] $metlist)
 begin{
 		$cwconfig = New-Object Amazon.CloudWatch.AmazonCloudWatchConfig
		$cwconfig.serviceURL = get-endpoint
		$cwconfig.UserAgent = $useragent
		$monputrequest  = new-object Amazon.Cloudwatch.Model.PutMetricDataRequest
		$monputrequest.namespace = "System/Windows"
		$response = New-Object psobject
		$metricdatalist = New-Object Collections.Generic.List[Amazon.Cloudwatch.Model.MetricDatum]

        If ($usingEncodedCredentials_VERYBAD)
        {
          $cwclient = New-Object Amazon.Cloudwatch.AmazonCloudWatchClient($accountinfo.AWSAccessKeyId,$accountinfo.AWSSecretKey,$cwconfig)
        }
        Else
        {
          $cwclient = New-Object Amazon.Cloudwatch.AmazonCloudWatchClient($cwconfig)
        }

	}
 process{
  			if ($PSCmdlet.shouldprocess($metlist.metricname,"The metric data "+$metlist.value.tostring() +" "+ $metlist.unit.tostring()+" will be pushed to cloudwatch")){
				$metricdatalist.add($metlist)
				Write-Verbose ("Metricname= " +$metlist.metricname+" Metric Value= "+ $metlist.value.tostring()+" Metric Units= "+$metlist.unit.tostring())
			}
 		}
 end{
 			if ($metricdatalist.count -gt 0) {
 				$monputrequest.metricdata = $metricdatalist
				$monresp =  $cwclient.PutMetricData($monputrequest)
				Add-Member -Name "RequestId" -MemberType NoteProperty -Value $monresp.ResponseMetadata.RequestId -InputObject $response -Force
				}
				else {throw "No metric data to push to CloudWatch; exiting script" }
				Write-Verbose ("RequestID: " +  $response.RequestId)
 	}
 }
 ### Pipelined call of functions that pushes metrics to CloudWatch. ###
get-diskmetrics | create-diskmetriclist | put-instancemem
tools\aws-monitor-diskusage\NOTICE.txt
Amazon Cloudwatch Monitoring Scripts for Windows 
Copyright 2011-2012 Amazon.com, Inc. or its affiliates. All Rights Reserved.
tools\chocolateyinstall.ps1

$ErrorActionPreference = 'Stop';

$packageName = 'aws-monitor-diskusage'
$toolsDir    = "$(Split-Path -parent $MyInvocation.MyCommand.Definition)"

Start-ChocolateyProcessAsAdmin "-noprofile -file `"$toolsDir\aws-monitor-diskusage\mon-put-metrics-disk-darwinjs.ps1`" -selfschedulewiththeseparams -disk_drive all -disk_space_util -disk_space_units gigabytes" "powershell.exe"
tools\chocolateyuninstall.ps1

$ErrorActionPreference = 'Stop';

$packageName = 'aws-monitor-diskusage'
$toolsDir    = "$(Split-Path -parent $MyInvocation.MyCommand.Definition)"

Start-ChocolateyProcessAsAdmin "-noprofile -file `"$toolsDir\aws-monitor-diskusage\mon-put-metrics-disk-darwinjs.ps1`" -unschedule" "powershell.exe"


In cases where actual malware is found, the packages are subject to removal. Software sometimes has false positives. Moderators do not necessarily validate the safety of the underlying software, only that a package retrieves software from the official distribution point and/or validate embedded software against official distribution point (where distribution rights allow redistribution).

Chocolatey Pro provides runtime protection from possible malware.


This package has no dependencies.

Discussion for the AWS Disk Monitoring Script For Windows (Install) Package

Ground Rules:

  • This discussion is only about AWS Disk Monitoring Script For Windows (Install) and the AWS Disk Monitoring Script For Windows (Install) package. If you have feedback for Chocolatey, please contact the Google Group.
  • This discussion will carry over multiple versions. If you have a comment about a particular version, please note that in your comments.
  • The maintainers of this Chocolatey Package will be notified about new comments that are posted to this Disqus thread, however, it is NOT a guarantee that you will get a response. If you do not hear back from the maintainers after posting a message below, please follow up by using the link on the left side of this page or follow this link to contact maintainers. If you still hear nothing back, please follow the package triage process.
  • Tell us what you love about the package or AWS Disk Monitoring Script For Windows (Install), or tell us what needs improvement.
  • Share your experiences with the package, or extra configuration or gotchas that you've found.
  • If you use a URL, the comment will be flagged for moderation until you've been whitelisted. Disqus-moderated comments are approved on a weekly schedule, if not sooner. It could take between 1 and 5 days for your comment to show up.