Tuesday, November 19, 2013

Steps to Install SharePoint 2010 using PowerShell


1) Download SPModule from the Microsoft download site. (http://www.microsoft.com/en-in/download/details.aspx?id=6194)

2) Change the server names, paths, and credentials in the scripts to match your current environment

3) Execute the spbinaries.ps1 script first

4) Then run the spconfigwizard.ps1 script (a run-order sketch follows)
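A minimal run-order sketch, assuming an elevated PowerShell prompt with both scripts in the current folder:

.\spbinaries.ps1
.\spconfigwizard.ps1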
 


SPBINARIES.PS1

 

#SharePoint Binaries Installation

#Importing PowerShell Module for SharePoint 2010
Write-Host -ForegroundColor Blue "Importing SharePoint PowerShell Binaries"
# Add the extracted SPModule folder to the module search path (adjust the path)
$env:PSModulePath = $env:PSModulePath + ";D:\Softwares\SPModule"
Import-Module SPModule.misc
Import-Module SPModule.setup
Write-Host -ForegroundColor Blue "Modules have been imported"
Write-Host -ForegroundColor Blue "**************************"
Write-Host -ForegroundColor Blue "Installing SharePoint Binaries"
# Run setup.exe unattended with the product key (adjust path and key)
Install-SharePoint -SetupExePath "D:\SharePoint\Install\setup.exe" -PIDKey "6VCWT-QBQVD-HG7KD-8BW8C-PBX7T"
Write-Host -ForegroundColor Blue "Finished Installing SharePoint Binaries"
Write-Host -ForegroundColor Blue "**************************"

 

SPCONFIGWIZARD.PS1

 

# Run with the SP farm account credentials; replace the account and passwords with your own
Add-PSSnapin Microsoft.SharePoint.PowerShell
$username = 'jda\jnetsvcDevfarm'
$password = 'Z6aJZaIR'
$password = ConvertTo-SecureString -String $password -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential ($username, $password)
# Create the farm: configuration DB plus the Central Administration content DB
New-SPConfigurationDatabase -DatabaseName "SP2010_Config" -DatabaseServer "MD1PRDSPSSQLDR" -AdministrationContentDatabaseName "SP2010_CentralAdmin" -Passphrase (ConvertTo-SecureString "FarmAcctPassw0rd!" -AsPlainText -Force) -FarmCredentials $cred
# Remaining provisioning steps, in the same order the psconfig wizard runs them
Install-SPHelpCollection -All
Initialize-SPResourceSecurity
Install-SPService
Install-SPFeature -AllExistingFeatures
New-SPCentralAdministration -Port 17012 -WindowsAuthProvider "NTLM"
Install-SPApplicationContent
# Optionally relocate the ULS logs
# Set-SPDiagnosticConfig -LogLocation "D:\Logs"
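When the script completes, a quick sanity check that the farm and Central Administration are in place (a sketch):

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
Get-SPFarm | Select-Object Name, Status
Get-SPWebApplication -IncludeCentralAdministration | Select-Object DisplayName, Url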

 

 

 

Friday, November 1, 2013

Script to delete files older than 5 Days

# Remove anything under the backup folder last written more than 5 days ago
Get-ChildItem -Path "D:\Backup\SiteCollection" |
    Where-Object {$_.LastWriteTime -lt (Get-Date).AddDays(-5)} |
    Remove-Item -Force -Recurse
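To preview what would be deleted before running it for real, the same pipeline with -WhatIf:

Get-ChildItem -Path "D:\Backup\SiteCollection" |
    Where-Object {$_.LastWriteTime -lt (Get-Date).AddDays(-5)} |
    Remove-Item -Force -Recurse -WhatIf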

Thursday, October 31, 2013

SSP - OSearch Stopping in MOSS


To stop OSearch in MOSS 2007 with force, follow these steps:
·         Get the process list by typing "tasklist" at the command prompt.

·         Note down the process ID of mssearch.exe.

·         Kill that process ID with taskkill at the command prompt, as sketched below.

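A minimal sketch of the commands (the PID 1234 is a placeholder; use the actual value from the tasklist output):

tasklist /FI "IMAGENAME eq mssearch.exe"
taskkill /PID 1234 /F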
This will kill the osearch process.

 

 

Tuesday, October 29, 2013

Deep Search Crawl Concept - SharePoint 2007


Crawling SharePoint sites using the SPS3 protocol handler


When you set up your content sources in Microsoft Office SharePoint Server 2007 (MOSS), you have a few options to choose from: SharePoint Sites, Web Sites, File Shares, Exchange Public Folders and Business Data. When you use the SharePoint Sites option, you're instructing the indexer to crawl a WSS web front end, and you use sps3:// as the prefix for your start address. This tells the crawler to use a SharePoint-specific protocol handler to enumerate the content and then grab the actual items from the SharePoint server.

A common question here is whether this uses some sort of RPC call into the SharePoint Web Front End (WFE) server. The answer is "no". People asking the question are usually trying to configure the firewalls between an indexer and a MOSS WFE and need to know which TCP/IP ports to open. You should be fine with just HTTP (or HTTPS, if your portal requires it). The SPS3 protocol handler uses a web service call (HTTP/SOAP) to enumerate the content and then uses regular HTTP GET requests to fetch the actual items. Crawling via the SPS3 protocol handler requires no RPC calls or direct database access to the target farm. That's the main reason why this type of crawling is supported over WAN links and tolerates latency well.

If you want to confirm this, configure two separate MOSS farms and have one crawl the other:

  • Configure a new content source using Central Administration, Shared Services, Search Settings, Content Sources, Add Content Source.
  • Specify SharePoint sites as the type and use SPS3://servername as the start address
  • Start a full crawl

If you have any network monitoring hardware or software, you will notice that one of the first things the crawler does is call the "Portal Crawl" web service at http://servername/_vti_bin/spscrawl.asmx. The methods in this web service are EnumerateBucket, EnumerateFolder, GetBucket, GetItem and GetSite. It is interesting to see how both "Enumerate" methods basically return just an "ID" and a "LastModified" datetime, hinting at how SharePoint can do incremental content crawls via this protocol handler. If you point your browser at that URL yourself, you can find additional information about the web service, including sample SOAP calls and the WSDL (as with any .NET web service). At this point, I could not find much detail on this web service beyond the actual class definition for Microsoft.Office.Server.Search.Internal.Protocols.SPSCrawl.
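A sketch of pulling that WSDL with PowerShell instead of a browser (servername is a placeholder; default credentials are sent for a Windows-authenticated site):

$wc = New-Object System.Net.WebClient
$wc.UseDefaultCredentials = $true
$wc.DownloadString("http://servername/_vti_bin/spscrawl.asmx?WSDL") | Out-File spscrawl.wsdl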

Wednesday, October 23, 2013

SQL Query to find the Last Access Date for a site collection in a Content Database

SELECT FullUrl AS 'Site URL',
       TimeCreated,
       DATEADD(d, DayLastAccessed + 65536, CONVERT(datetime, '1/1/1899', 101)) AS LastAccessDate
FROM Webs
WHERE (DayLastAccessed <> 0) AND (FullUrl LIKE N'sites/%')
ORDER BY LastAccessDate

Steps to run, from a client machine, the scripts that live on the server


First and foremost, set the execution policy to Bypass on both the client and server machines. Since the script below connects with CredSSP authentication, CredSSP must also be enabled on both sides.
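A sketch of that one-time setup, run from an elevated PowerShell prompt (the delegate computer name is taken from the script below; adjust it to your environment):

# On both machines: allow the scripts to run
Set-ExecutionPolicy Bypass -Force
# On the client: allow credentials to be delegated to the server
Enable-WSManCredSSP -Role Client -DelegateComputer "MD1DEVVSPEAPP01" -Force
# On the server: accept delegated credentials
Enable-WSManCredSSP -Role Server -Force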

 

Script to execute on the client machine:

 

param ($DropLocation = "\\MD1DEVVSPEAPP01\Scripts", $Server = "MD1DEVVSPEAPP01")

$secpasswd = ConvertTo-SecureString "68qZxpTi" -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential ("JDA\jnetsvcQAfarm", $secpasswd)

Invoke-Command -ArgumentList $DropLocation -ScriptBlock {
    param($DropLocation)
    Add-PSSnapin Microsoft.SharePoint.PowerShell
    # Dot-source the script from the share so it runs in this remote session
    $script = Join-Path $DropLocation "SPServicesStart.ps1"
    . $script
} -ComputerName $Server -Authentication CredSSP -Credential $cred

 

 

Wednesday, September 25, 2013

Networking: essential commands for SharePoint

The following network commands help you understand where your machine sits on the network and how its traffic is routed.
Open a command prompt and try them (example invocations follow the list):


1. ping - the notorious ping sends a packet to a designated Internet host or network computer and measures its response time. The target computer will (hopefully) return a reply.
2. ipconfig (add /all to get the full picture) - displays the TCP/IP network configuration values (assigned IP, gateway and DNS server).
3. tracert - the actual path between two computers on the Internet is not a straight line but consists of numerous segments or "hops" from one intermediate computer to another. Tracert shows each step of the path taken.
4. pathping - combines the functions of ping and tracert. Pathping first lists the number of hops required to reach the address you are testing and then sends multiple pings to each router between you and the destination.
5. netstat - displays the active TCP connections and the ports on which the computer is listening, Ethernet statistics, the IP routing table, and statistics for the IP, ICMP, TCP and UDP protocols. "netstat -a" displays all your connections; "netstat -b" shows the executable files involved in creating each connection.
6. nslookup - helps diagnose the Domain Name System (DNS) infrastructure.
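A few example invocations (www.example.com is a placeholder target; all of these also run in a PowerShell console):

ping www.example.com
tracert www.example.com
pathping www.example.com
ipconfig /all
netstat -a
nslookup www.example.com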

Sync Issue between SSP and Content DB in MOSS


I have run into an issue where the sync connection between the SSP and a content database (SharePoint site) was broken.
The following SQL query finds where the link between the SSP and the content database is broken:

SELECT * FROM [WSS_Content_SharedServices_DB_PROD].[dbo].[SiteSynch]
WHERE Moving = '1'   -- Moving is a bit column: 1 means true

·         The link between the Shared Services Provider (SSP) and the web application is broken

o    because the "Moving" attribute is set to 'True' in the dbo.SiteSynch table in the Shared Services database, due to incorrect usage of the stsadm -o preparetomove command


 
Output of the Query:


Record 1:
  ContentDBID: B9AAB4E5-C1AA-4B03-A1BC-9FC08A0ED42E
  SiteID: C5FED6F5-415A-4585-BCE5-11ADE6164504
  LastSynch: 00:01.5
  ChangeToken: 1;0;b9aab4e5-c1aa-4b03-a1bc-9fc08a0ed42e;635156712099800000;36269
  SchemaVersion: 66
  LastChangeSynchSuccess: 1
  Moving: 1
  MovingDeleted: 1
  Registered: 0

Record 2:
  ContentDBID: 96CC1FCD-5EAE-48F4-B335-1FFD5944197F
  SiteID: 4C8D4AF8-70BF-4EB3-971B-BA94EC260840
  LastSynch: 00:03.8
  ChangeToken: NULL
  SchemaVersion: 66
  LastChangeSynchSuccess: 0
  Moving: 1
  MovingDeleted: 1
  Registered: 0

 
I found 2 sites in the "moving" state, which means the link is broken between the SSP and those two site collections. I worked out the site collection names with the help of the above SiteIDs and executed the preparetomove command with the undo parameter, which restarts the sync process between the SSP and the site collection.

 
Queries to find the site collection name and database name from the GUIDs:

 
·         Database name from its GUID (run against the configuration database):

SELECT ID, Name
FROM [Name_of_Config_DB].[dbo].[Objects]
WHERE ID = 'GUID_of_Database'

·         Site collection URL from its GUID (run against the content database; the Webs table carries Id, SiteId and FullUrl, so filter on SiteId):

SELECT [Id], [SiteId], [FullUrl]
FROM [Name_of_Content_DB].[dbo].[Webs]
WHERE SiteId = 'GUID_of_Site_Collection'

Once the above was done, I executed the following stsadm command (preparetomove with the undo parameter), which restarted the sync:
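A sketch of the command's shape (the site URL is a placeholder for the affected site collection):

stsadm -o preparetomove -site http://servername/sites/sitename -undo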
 
 

 
Sunday, September 15, 2013

SharePoint - Search Crawl Sequential Process


  1. All the SharePoint content source information (sites, crawl rules, etc.) is stored in the Search Administration DB in SQL Server.
  2. The same information (the start addresses) is replicated into the registry of the crawl servers.
  3. When search administration starts a full crawl, it copies the registry information from the crawl server into the queue in the crawl database (in SQL Server); the admin component then assigns a crawl server based on the registry and start-address information, and the crawl starts (a PowerShell sketch for triggering a full crawl follows the list).
  4. The crawl server determines which protocol handler to use based on the type of site specified in the content source.
  5. The crawl server then crawls the content using IFilters. At this point the data is crawled but not yet indexed, so the indexing engine indexes the crawled content and makes it available as full-text index data.
  6. This indexed data is copied to the query server designated in the Search Service Application; if there are mirror index partitions, the data is copied to the mirror server as well.
  7. Once the copy of the indexed data has been moved to the query server, it is deleted from the crawl server.
  8. Finally, the crawl server sends details such as the URL and metadata to the crawl and property databases by using the HTTP connector. This is the final state of crawling.
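A sketch for kicking off the full crawl in step 3 from PowerShell on SharePoint 2010 (the content source name is a placeholder):

$ssa = Get-SPEnterpriseSearchServiceApplication
$contentSource = Get-SPEnterpriseSearchCrawlContentSource -SearchApplication $ssa -Identity "Local SharePoint sites"
$contentSource.StartFullCrawl()
# Check the crawl status afterwards
$contentSource.CrawlStatus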

Friday, September 13, 2013

CheckPoints - SharePoint Migration

I would like to list the areas in which I have faced issues during DB migrations; I strongly feel it will be of great assistance to check the following areas before starting any migration activity.
(Migration types include version-to-version upgrades as well as moves within the same version of SharePoint.)


  • Update Email Alerts
  • Incoming Email lists
  • Intranet and extranet URLs and broken links
  • Navigation
  • User permissions
  • Versioning issues
  • Space issues in all servers
  • Network speed to copy data between domains
  • Authentication
  • Solutions
  • Orphaned site collections/content databases

Deleting Orphaned Site Collection

Here is the quick step to delete an orphaned site collection as a final resolution, when none of the usual methods work (the usual stsadm route is sketched after the query).
 
Delete the site collection with a SQL query executed against the configuration database:

 

DELETE FROM SiteMap
WHERE Id = 'a97953bd-6305-4eb0-872b-4fb4009373f6'   -- Id is the site collection GUID
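For reference, the usual supported route to try first (a sketch; the database name and server are placeholders, and the site ID comes from the SiteMap row above):

stsadm -o deletesite -force -siteid a97953bd-6305-4eb0-872b-4fb4009373f6 -databasename WSS_Content -databaseserver SQLSERVERNAME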

SharePoint 2013 - ADFS - Configuration

The main objective of this post is to provide detailed configuration steps on how to set up SAML Authentication for SharePoint 2013/2016 w...