
3 Steps to use External Interrupt on ESP8266

I was in the middle of replicating my 433 MHz remote control using an Arduino. The receiver circuit is quite simple and uses Arduino pin 2 (an interrupt pin) to detect the data signal from the receiver.

I was wondering whether I could do the same thing with the ESP8266. After a short Google search, I found some information about interrupts on pin D3 (GPIO0) and also D0 (GPIO16). At a glance it seems doable, although note that GPIO16 (D0) does not actually support attachInterrupt; it can only be used to wake the chip from deep sleep.

Here are the steps to use the interrupt.

  1. Initialise the IO pin as an input
  2. Attach the interrupt to the pin, with the interrupt subroutine (ISR) as its handler
  3. Write the interrupt subroutine (ISR).

Here is the code that I stumbled upon on the Circuits4you.com website.

const int interruptPin = 0; //GPIO 0 (Flash Button) STEP 1
const int LED=2; //On board blue LED
 
void setup() {
  Serial.begin(115200);
  pinMode(LED,OUTPUT);
  pinMode(interruptPin, INPUT_PULLUP);
  attachInterrupt(digitalPinToInterrupt(interruptPin), handleInterrupt, CHANGE); // STEP 2
}
 
void loop()
{
    digitalWrite(LED,HIGH); //LED off
    delay(1000);
    digitalWrite(LED,LOW); //LED on
    delay(1000);
}
 
//This routine is executed when the interrupt occurs, i.e. on a change of input state - STEP 3
//Note: newer ESP8266 Arduino cores require the ISR to be marked with ICACHE_RAM_ATTR (or IRAM_ATTR) so it is placed in IRAM
void ICACHE_RAM_ATTR handleInterrupt() {
    Serial.println("Interrupt Detected");
}

I will try this tonight and report back whether the Interrupt detection works or not.

The Remote.

BAUHN Remote Control

The remote consists of 4 ON buttons and 4 OFF buttons, labelled A, B, C and D. Each button pair works independently, controlling its own remote power point switch.

The power point needs to be synced with one of the button pairs (A, B, C or D). In this experiment I am detecting button B.

Below is the connection between the ESP8266 and the 433 MHz receiver. As you can see, only 3 cables are connected: 3.3V Vcc for power (red cable), GND (green cable) and Data (yellow cable) connected to GPIO0 of the ESP8266 (D3).

I used a slightly different sketch utilising the rc-switch library, which provides the same detection I had previously done using the Arduino Uno. Here is the sketch I used to detect the remote.

#include <RCSwitch.h>

RCSwitch mySwitch = RCSwitch();

void setup() {
  Serial.begin(9600);
  mySwitch.enableReceive(0); // Receiver on interrupt 0 => that is pin GPIO0 or D3 on the ESP8266
  pinMode(LED_BUILTIN, OUTPUT);
}

void loop() {
  if (mySwitch.available()) {
    Serial.print("Received ");
    output(mySwitch.getReceivedValue(), mySwitch.getReceivedBitlength(), mySwitch.getReceivedDelay(), mySwitch.getReceivedRawdata(), mySwitch.getReceivedProtocol());
    mySwitch.resetAvailable();
  }
}
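Note that the sketch calls an output() helper that is not shown above; in the rc-switch library it lives in the output.ino file that ships with the ReceiveDemo_Advanced example. If you do not want to copy that file into your sketch folder, a minimal stand-in along these lines will do (it simply prints the decoded value and ignores the raw timing data):

// Minimal stand-in for the output() helper from rc-switch's ReceiveDemo_Advanced example
void output(unsigned long decimal, unsigned int length, unsigned int delay, unsigned int* raw, unsigned int protocol) {
  Serial.print("Decimal: ");
  Serial.print(decimal);
  Serial.print(" (");
  Serial.print(length);
  Serial.print("Bit) PulseLength: ");
  Serial.print(delay);
  Serial.print(" microseconds Protocol: ");
  Serial.println(protocol);
}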

I connected the receiver to pin D3 on the ESP8266 and got the following result when I pressed the B-ON button.

Received Decimal: 9085236 (24Bit) Binary: 100010101010000100110100 Tri-State: not applicable PulseLength: 500 microseconds Protocol: 5
Raw data: 7036,1061,455,589,934,607,918,613,952,1071,454,575,955,1065,457,565,971,1033,509,509,1021,1022,506,526,995,542,981,548,977,543,979,1055,482,561,1057,466,1080,942,548,980,540,486,1067,956,552,484,1071,564,966,
Received Decimal: 9085236 (24Bit) Binary: 100010101010000100110100 Tri-State: not applicable PulseLength: 500 microseconds Protocol: 5
Raw data: 6969,1116,407,629,907,553,987,554,928,1085,445,593,924,1123,405,623,912,1122,403,625,901,1049,489,537,988,546,974,556,965,565,970,1045,483,561,944,559,969,1070,453,1073,453,578,949,1066,460,577,946,545,1021,

So I can conclude that the interrupt is definitely working. The next step would be to capture a few more codes and send them back via a 433 MHz transmitter to try to replicate each of the buttons; a transmit sketch along the lines of the one below should do it. Given that the ESP8266 can connect to WiFi and can also serve as a web server, this will make it possible to control the remote over the internet as an IoT device.
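For reference, the transmit side could look something like the following. This is an untested sketch: it assumes the transmitter data pin is wired to GPIO2 (D4) and simply replays the B-ON code, protocol and pulse length captured above.

#include <RCSwitch.h>

RCSwitch mySwitch = RCSwitch();

void setup() {
  mySwitch.enableTransmit(2);    // transmitter data pin on GPIO2 (D4) - adjust to your wiring
  mySwitch.setProtocol(5);       // protocol reported by the receive sketch
  mySwitch.setPulseLength(500);  // pulse length reported by the receive sketch
}

void loop() {
  mySwitch.send(9085236, 24);    // replay the captured B-ON code (24 bits)
  delay(5000);                   // repeat every 5 seconds while testing
}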

Stay tuned for the next article. Please share or subscribe if you'd like to see more articles like this, and feel free to drop me a comment.


Connecting IoT Sensor Data to Node-RED

This is the continuation of the temperature sensor project in the previous post. The concept is to allow the data from sensors (temperature, motion) to be displayed in Apple HomeKit, so that the user can interact with the information and control the connected IoT devices (lights, fan, etc.). It is best described in the following picture.

The following instructions show how to install Node-RED on a Linux computer running Debian.

sudo npm install -g --unsafe-perm node-red

You will also need to install the Mosquitto MQTT message broker. The following commands add the Mosquitto repository key and package list (for Debian Jessie) and install the broker:

wget http://repo.mosquitto.org/debian/mosquitto-repo.gpg.key
sudo apt-key add mosquitto-repo.gpg.key
cd /etc/apt/sources.list.d/
sudo wget http://repo.mosquitto.org/debian/mosquitto-jessie.list
sudo apt-get update
sudo apt-get install mosquitto
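To check that the broker is running, you can install the Mosquitto command line clients and publish a test message to yourself (assuming the broker is on the same machine):

sudo apt-get install mosquitto-clients
mosquitto_sub -h localhost -t "test" &
mosquitto_pub -h localhost -t "test" -m "hello"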

You will also need the Python development headers, so grab them using the following command:

sudo apt-get install python-dev

Test the installation. In this example I was using Debian Linux; type the command node and, if you get the Node.js prompt (>), the installation is successful. You can then exit Node.js by typing the .exit command at the prompt.

If all goes well, you can run the node-red command from the command prompt. You should get a startup message showing that Node-RED is now running at http://127.0.0.1:1880.

Node-RED Settings

The Node-RED settings file is called settings.js; on Linux the global copy is located in the /usr/lib/node_modules/node-red folder. You will have another settings.js file in the .node-red folder in your home directory, and this is the one loaded by default.
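For illustration, the top of a default settings.js looks roughly like this (the exact contents vary between Node-RED versions); changing uiPort, for example, moves the editor off the default port 1880:

// Illustrative excerpt of settings.js - exact contents vary between Node-RED versions
module.exports = {
    // the TCP port the Node-RED web server listens on
    uiPort: process.env.PORT || 1880,
};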

Creating the flow in Node-RED

Now that you have a running Node-RED, it is time to create the flow. In this example we will create a simple flow to read the temperature posted by our ESP8266. Let's start by firing up your favourite browser and pointing it to the following URL: http://127.0.0.1:1880/

You will be presented with a blank screen similar to the following picture. To start creating a flow, drop in an "Inject" node from the input section. We will use this as a trigger to get the temperature reading. Once you have dropped it in, double-click it to set its properties. We call the node "timestamp" and set it to repeat every 4 minutes.

The next step is to connect this to an "http request" node, so drop one in and configure it to make an HTTP GET call to a server-side script on the web server. The script needs to return the temperature in JSON format as below:

{"CurrentTemperature":25}

So my data_store2.php script does exactly that, as shown in the following code:

<?php
/* Read the temperature from the temp.txt file and
   return the value back in JSON format for HomeKit */
$theparam = $_GET;
$file = './temp.txt';
$temperature = file_get_contents($file);
echo '{"CurrentTemperature":'.$temperature.'}';
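You can test the script from the command line before wiring it into the flow; assuming it is served by a web server on the same machine, something like this should return the JSON payload:

curl http://127.0.0.1/data_store2.php
# expected output, e.g. {"CurrentTemperature":25}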

Now the final step is to connect to the "HomeKit" node from the advanced nodes section. Once you drop in the "HomeKit" node, you can double-click it to configure its properties as below.

Once everything has been connected, it is time to deploy the flow. You can do this by clicking the "Deploy" button at the top of the Node-RED window. You will need to click "Deploy" whenever you make changes to the flow. Sometimes the deployment might stop the Node-RED server, in which case you just have to run the node-red command again from the command prompt.

If all goes well, you can now test the flow by clicking the button next to the "timestamp" node; the temperature should be read from the web server and displayed in HomeKit, similar to the following picture.

That concludes this session on how to configure Node-RED to work with our temperature sensor data from the ESP8266. Please let me know if you have any questions, and don't forget to subscribe for updates on similar projects. In the next session we are going to connect this to Apple HomeKit on an iPhone or iPad.


Download Microsoft SharePoint List Attachments using a PowerShell script

I stumbled across a problem when trying to download attachments from a SharePoint list. The list has more than 50,000 rows, and the problem with a big list is that Windows Explorer is not able to display it in Explorer view, so a PowerShell script is needed to download all the attachments programmatically.

Step 1. Make sure you have the SharePoint client DLLs required, as specified in the following code


[void][Reflection.Assembly]::LoadFrom("$env:CommonProgramFiles\Microsoft Shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.dll")
[void][Reflection.Assembly]::LoadFrom("$env:CommonProgramFiles\Microsoft Shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.Runtime.dll")

Step 2. Define the SharePoint site and the list

The SharePoint site and the list are defined using the $webUrl and $library variables respectively; also, don't forget to specify the local folder where the files will be downloaded.

$webUrl = "http://website.com/sites/sharepointsite" 
$library = "SharepointLibrary"
#Local Folder to dump files
$tempLocation = "C:\temp\"

Step 3. Define how many rows the CamlQuery should return on each iteration

The beauty of the PowerShell script is that you can specify how many rows to return on each iteration, so the script has no problem paging through a big list of over 50,000 items and downloading all the attachments. The following code limits the query to return 3,000 rows per batch; a minimal paging loop is sketched just after it.

$camlQuery = New-Object Microsoft.SharePoint.Client.CamlQuery
$camlQuery.ViewXml = "<View><RowLimit>3000</RowLimit></View>"
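As a minimal sketch of how the paging works (using the $camlQuery above, and assuming $clientContext and the $listRelItems list are already loaded as in the full script further down), the ListItemCollectionPosition returned with each batch is fed back into the query until it comes back empty:

# Minimal paging sketch - uses the $camlQuery defined above
do {
    $batch = $listRelItems.GetItems($camlQuery)
    $clientContext.Load($batch)
    $clientContext.ExecuteQuery()
    # remember where this batch ended so the next iteration continues from there
    $camlQuery.ListItemCollectionPosition = $batch.ListItemCollectionPosition
    Write-Host "Fetched" $batch.Count "items in this batch"
} while ($camlQuery.ListItemCollectionPosition -ne $null)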

Step 4. Define the folder structure to hold all the attachments to be downloaded

The following code uses the combination of Title and ID as the folder name to store the attachments. It also checks whether the folder already exists before creating a new one.


    $folderName = $listItem["Title"] + "_" + $listItem["ID"]
    $destinationfolder = $tempLocation + "\" + $folderName

    #check if the folder already exists; if not, create it
    if (!(Test-Path -path $destinationfolder))
    {
        $dest = New-Item $destinationfolder -type directory
        Write-Host "Created Folder with Name:" $folderName
    }

The following is the full code to download the list attachments and put them in the local folder. The credentials used are those of the user executing the script, which means the logged-in user will need access to the SharePoint site and permission to download the attachments.


[void][Reflection.Assembly]::LoadFrom("$env:CommonProgramFiles\Microsoft Shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.dll")
[void][Reflection.Assembly]::LoadFrom("$env:CommonProgramFiles\Microsoft Shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.Runtime.dll")
Clear-Host
#$cred = Get-Credential "user@microsoft.com"
#$credentials = New-Object Microsoft.Sharepoint.Client.SharePointOnlineCredentials($cred.Username, $cred.Password)
$webUrl = "http://website.com/sites/sharepointsite"

$clientContext = New-Object Microsoft.Sharepoint.Client.ClientContext($webUrl)
Write-Host "Connecting To Site: " $webUrl   

 $username = "$env:USERDOMAIN\$env:USERNAME"

$library = "SharepointLibrary" 
#Local Folder to dump files
$tempLocation = "C:\temp\"    

$global:web = $clientContext.Web;
$global:site = $clientContext.Site;

$clientContext.Load($web)
$clientContext.Load($site)

$listRelItems = $clientContext.Web.Lists.GetByTitle($library)

$clientContext.Load($listRelItems)
$clientContext.ExecuteQuery();
Write-Host "list item count " $listRelItems.ItemCount

$camlQuery = New-Object Microsoft.SharePoint.Client.CamlQuery
$camlQuery.ViewXml = "<View><RowLimit>3000</RowLimit></View>"
 $listCollection = New-Object System.Collections.Generic.List[string] 
 $count = 0
Do {
$allItems=$listRelItems.GetItems($camlQuery)
$clientContext.Load($allItems)
$clientContext.ExecuteQuery()
$camlQuery.ListItemCollectionPosition = $allItems.ListItemCollectionPosition
foreach ($listItem in $allItems)
 {
    $folderName=$listItem["Title"]+"_"+$listItem["ID"]
    $destinationfolder = $tempLocation + "\"+ $folderName 

     #check if folder is exist or not, if not exist then create new
  if (!(Test-Path -path $destinationfolder))        
   {            
     $dest = New-Item $destinationfolder -type directory      
     Write-Host "Created Folder with Name:" $folderName    
   }
    $clientContext.load($listItem)
    $clientContext.ExecuteQuery();

    $attach = $listItem.AttachmentFiles
    $clientContext.load($attach)
    $clientContext.ExecuteQuery();
    if($attach.Count -gt 0){
        Write-Host "Number of attachments:" $attach.Count
        foreach ($attachitem in $attach){
            Write-Host "Downloading Attachements started: "   $attachitem.FileName
            $attachpath = $webUrl + "/Lists/"+ $library + "/Attachments/" + $listItem["ID"] + "/" + $attachitem.FileName
            Write-Host "path: " $attachpath 
         
            $path = $destinationfolder + "\" + $attachitem.FileName
            Write-Host "Saving to the location:"  $path

            $siteUri = [Uri]$attachpath
            $client = new-object System.Net.WebClient
            $client.UseDefaultCredentials=$true
            
            try{
                  $client.DownloadFile($attachpath, $path)
                  $client.Dispose()
            } catch{
                write-error "Failed to download $attachpath, $_ "
            }

        }
    }else {
     Write-Host   "The current item does not have any attachments"
    }
  }
Write-Host " List item" $count
$count++
} while ($camlQuery.ListItemCollectionPosition -ne $null)
     Write-Host   "Script execution done !" 

Please let me know if the above script is useful to you, and don't forget to share or subscribe for more frequent updates on similar topics. You can also drop me a line if you have any questions.


4 Steps to download a Microsoft SharePoint Document Library recursively

I stumbled across this problem when we were decommissioning Microsoft SharePoint. We had a huge document library and it was not possible to copy it from Explorer view, so the solution was to use a PowerShell script to do this automagically.

Step 1. Define the DLLs that are required.

This is done through the following code snippet. It is crucial to have the two DLLs in place for the copy function to work. The script will use the credentials of the user logged into the machine and executing the script, which removes the complexity of having to enter SharePoint credentials into the script.

# Load the SharePoint 2013 .NET Framework Client Object Model libraries. # 
[void][Reflection.Assembly]::LoadFrom("c:\Microsoft.SharePoint.Client.dll")
[void][Reflection.Assembly]::LoadFrom("c:\Microsoft.SharePoint.Client.Runtime.dll")

Step 2. Define the SharePoint site URL and the Document Library

You can simply enter the SharePoint URL by replacing the following $serverURL variable. Enter the document library name by replacing the $DocumentLibrary variable, and don't forget to define the destination folder.

$serverURL = "http://sharepoint.url/sites/sitename"
$destination = "C:\temp\"
$DocumentLibrary = "Document Library Name"

Step 3. Choose whether you only want a specific folder to be downloaded from the Document Library

Change the folder name to the one you are interested in downloading; in the following example we are only interested in downloading the folder "Payments" and all the folders underneath it. To download every top-level folder instead, remove the if test so recurse is called for each folder.


function Parse-Lists ($Lists)
{
$clientContext.Load($Lists)
$clientContext.Load($Lists.RootFolder.Folders)
$clientContext.ExecuteQuery()
    
    foreach ($Folder in $Lists.RootFolder.Folders)
        {
            if ($Folder.name -eq "Payments"){   #only download the selected folder
                recurse $Folder
            }
        }

}

Step 4. Execute the script from a PowerShell window or from the command line.

To execute the script from the command line, run the following PowerShell command, assuming the script is saved as "scriptname.ps1":

powershell.exe -ExecutionPolicy Bypass -File .\scriptname.ps1

Here is the full script to download the SharePoint document library. Be careful: the script will download the entire document library recursively, so please make sure you check Step 3 above. With great power comes great responsibility.

# Load the SharePoint 2013 .NET Framework Client Object Model libraries. # 
[void][Reflection.Assembly]::LoadFrom("c:\Microsoft.SharePoint.Client.dll")
[void][Reflection.Assembly]::LoadFrom("c:\Microsoft.SharePoint.Client.Runtime.dll")
Clear-Host

$serverURL = "http://sharepoint.url/sites/sitename"
#$siteUrl = $serverURL+"/documents"
$destination = "C:\temp\"
$DocumentLibrary = "Document Library Name"
$downloadEnabled = $true
$versionEnabled = $false

# Authenticate with the SharePoint Online site. # 
#$username = ""
#$Password = ""
#$securePassword = ConvertTo-SecureString $Password -AsPlainText -Force  

$clientContext = New-Object Microsoft.SharePoint.Client.ClientContext($serverURL) 
#$credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($username, $securePassword) 
#$clientContext.Credentials = $credentials 
if (!$clientContext.ServerObjectIsNull.Value) 
{ 
    Write-Output "Connected to SharePoint Online site: '$serverURL'"
} 


function HTTPDownloadFile($ServerFileLocation, $DownloadPath)
{
#Download the file from the version's URL, download to the $DownloadPath location
    $webclient = New-Object System.Net.WebClient
    $webclient.UseDefaultCredentials = $true   # use the logged-in user's credentials (the SharePoint Online credential lines are commented out above)
    Write-Output "Download From ->'$ServerFileLocation'"
    Write-Output "Write to->'$DownloadPath'"
    $webclient.Headers.Add("X-FORMS_BASED_AUTH_ACCEPTED", "f")
    $webclient.DownloadFile($ServerFileLocation,$DownloadPath)
}

function DownloadFile($theFile, $DownloadPath)
{
    $fileRef = $theFile.ServerRelativeUrl;
    Write-Host $fileRef;
    $fileInfo = [Microsoft.sharepoint.client.File]::OpenBinaryDirect($clientContext, $fileRef);
    $fileStream = [System.IO.File]::Create($DownloadPath)
    $fileInfo.Stream.CopyTo($fileStream);
    $fileStream.Close()
}

function Get-FileVersions ($file, $destinationFolder)
{
    $clientContext.Load($file.Versions)
    $clientContext.ExecuteQuery()
    foreach($version in $file.Versions)
    {
        #Add version label to file in format: [Filename]_v[version#].[extension]
        $filesplit = $file.Name.split(".") 
        $fullname = $filesplit[0] 
        $fileext = $filesplit[1] 
        $FullFileName = $fullname+"_v"+$version.VersionLabel+"."+$fileext           

        #Can't create an SPFile object from historical versions, but CAN download via HTTP
        #Create the full File URL using the Website URL and version's URL
        $ServerFileLocation = $serverURL+"/"+$version.Url

        #Full Download path including filename
        $DownloadPath = $destinationfolder+"\"+$FullFileName
        
        if($downloadenabled) {HTTPDownloadFile "$ServerFileLocation" "$DownloadPath"}

    }
}

function Get-FolderFiles ($Folder)
{
    $clientContext.Load($Folder.Files)
    $clientContext.ExecuteQuery()

    foreach ($file in $Folder.Files)
        {

            $folderName = $Folder.ServerRelativeURL
            $folderName = $folderName -replace "/","\"
            $folderName = $destination + $folderName
            $fileName = $file.name
            $fileURL = $file.ServerRelativeUrl
            
                
            if (!(Test-Path -path $folderName))
            {
                $dest = New-Item $folderName -type directory 
            }
                
            Write-Output "Destination -> '$folderName'\'$filename'"

            #Create the full File URL using the Website URL and version's URL
            $ServerFileLocation = $serverUrl+$file.ServerRelativeUrl

            #Full Download path including filename
            $DownloadPath = $folderName + "\" + $file.Name
                    
            #if($downloadEnabled) {HTTPDownloadFile "$ServerFileLocation" "$DownloadPath"}
            if($downloadEnabled) {DownloadFile $file "$DownloadPath"}

            if($versionEnabled) {Get-FileVersions $file $folderName}
            
    }
}


function Recurse($Folder) 
{
       
    $folderName = $Folder.Name
    $folderItemCount = $folder.ItemCount

    Write-Output "List Name ->'$folderName'"
    Write-Output "Number of List Items->'$folderItemCount'"

    if($Folder.name -ne "Forms")
        {
            #Write-Host $Folder.Name
            Get-FolderFiles $Folder
        }
 
    Write-Output $folder.ServerRelativeUrl
 
    $thisFolder = $clientContext.Web.GetFolderByServerRelativeUrl($folder.ServerRelativeUrl)
    $clientContext.Load($thisFolder)
    $clientContext.Load($thisFolder.Folders)
    $clientContext.ExecuteQuery()
            
    foreach($subfolder in $thisFolder.Folders)
        {
            Recurse $subfolder  
        }       
}


function Parse-Lists ($Lists)
{
$clientContext.Load($Lists)
$clientContext.Load($Lists.RootFolder.Folders)
$clientContext.ExecuteQuery()
    
    foreach ($Folder in $Lists.RootFolder.Folders)
        {
            if ($Folder.name -eq "Payments"){   #only download the selected folder
                recurse $Folder
            }
        }

}

$rootWeb = $clientContext.Web
$LibLists = $rootWeb.lists.getByTitle($DocumentLibrary)
$clientContext.Load($rootWeb)
$clientContext.load($LibLists)
$clientContext.ExecuteQuery()

Parse-Lists $LibLists

 

Please let me know if the above script is useful; feel free to subscribe to my blog, share this script, or ask me any questions about it.