Wednesday, September 25, 2013

Networking: essential commands for SharePoint

The following list of network commands will help you understand where your machine is on the network and how traffic is routed.
Open a Command Prompt (cmd.exe) window and enjoy (example invocations are shown after the list):


1. ping - sends out a packet to a designated internet host or network computer and measures its response time. The target computer will (hopefully) return a signal.
2. ipconfig (add /all to get the full picture) - displays the TCP/IP network configuration values (assigned IP address, gateway, and DNS servers).
3. tracert - the actual path between two computers on the Internet is not a straight line but consists of numerous segments or "hops" from one intermediate computer to another. Tracert shows each step of the path taken.


4. pathping - combines the functions of ping and tracert. Pathping will first list the number of hops required to reach the address you are testing and then send multiple pings to each router between you and the destination.


5. netstat - displays the active TCP connections, the ports on which the computer is listening, Ethernet statistics, the IP routing table, and statistics for the IP, ICMP, TCP, and UDP protocols.

The command "netstat -a" will display all your connections. The command "netstat -b" will show the executable files involved in creating each connection.


6. nslookup - helps diagnose the Domain Name System (DNS) infrastructure.
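
For quick reference, here are typical example invocations of the commands above; the host names are placeholders, so substitute your own targets:

ping www.contoso.com
ipconfig /all
tracert www.contoso.com
pathping www.contoso.com
netstat -a
netstat -b
nslookup www.contoso.com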

Sync Issue between SSP and Content DB in MOSS


I have run into an issue where the sync connection between the SSP and a content database (SharePoint site) was broken.
The following SQL query finds where the link is broken between the SSP and the content database:

SELECT * FROM [WSS_Content_SharedServices_DB_PROD].[dbo].[SiteSynch] WHERE Moving = '1'    -- 'Moving' is a bit column, so true is stored as 1

•         The link between the Shared Services Provider (SSP) and the web application is broken

o    because the "Moving" attribute is set to 'True' in the dbo.SiteSynch table of the Shared Services database, caused by incorrect usage of the stsadm -o preparetomove command


 
Output of the Query:


Row 1:
ContentDBID: B9AAB4E5-C1AA-4B03-A1BC-9FC08A0ED42E
SiteID: C5FED6F5-415A-4585-BCE5-11ADE6164504
LastSynch: 00:01.5
Change Token: 1;0;b9aab4e5-c1aa-4b03-a1bc-9fc08a0ed42e;635156712099800000;36269
SchemaVersion: 66
LastChangeSynchSuccess: 1
Moving: 1
MovingDeleted: 1
Registered: 0

Row 2:
ContentDBID: 96CC1FCD-5EAE-48F4-B335-1FFD5944197F
SiteID: 4C8D4AF8-70BF-4EB3-971B-BA94EC260840
LastSynch: 00:03.8
Change Token: NULL
SchemaVersion: 66
LastChangeSynchSuccess: 0
Moving: 1
MovingDeleted: 1
Registered: 0

 
I found two sites in the "moving" state, which means the link is broken between the SSP and those two sites. So I identified the site collection names with the help of the SiteIDs above and executed the preparetomove command with the undo parameter, which restarts the sync process between the SSP and the site collection.

 
Queries to find the database name and the site collection URL from the GUIDs above:

 
•         SELECT ID, Name
          FROM [Name_of_Config_DB].[dbo].[Objects]
          WHERE ID = 'GUID_of_Database'

•         SELECT [Id], [SiteId], [FullUrl]
          FROM [Name_of_Content_DB].[dbo].[Objects]
          WHERE ID = 'GUID_of_Site_Collection'

Once the above is done, execute the following stsadm command (preparetomove with the undo parameter), which restarts the sync process:
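
A sketch of the general preparetomove syntax with the undo switch; the database server, content database name, and site collection URL below are placeholders, so replace them with the values returned by the queries above:

stsadm -o preparetomove -ContentDB SQLServerName:WSS_Content_DB_Name -Site http://servername/sites/sitecollection -Undo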
 
 

 

Sunday, September 15, 2013

SharePoint - Search Crawl Sequential Process


  1. All the SharePoint content source information (sites, crawl rules, etc.) is stored in the Search Administration DB in SQL Server.
  2. The same information (the start addresses) is replicated into the registry of the crawl servers.
  3. When search administration starts a full crawl, it copies the registry information from the crawl server to the queue in the crawl database (in SQL Server); the admin component then assigns a crawl server based on the registry and start address information, and the crawl starts.
  4. The crawl server determines which protocol to use based on the type of site specified in the content source.
  5. The crawl server then starts crawling the content using the appropriate IFilters. At this point the data is crawled but not indexed, so the indexing engine indexes the crawled content and makes it available as full-text index data.
  6. This indexed data is then copied to the query server designated in the Search Service Application; if there are mirror index partitions, the data is also copied to the mirror server.
  7. Once the copy of the indexed data has been moved to the query server, it is deleted from the crawl server.
  8. Finally, the crawl server sends details such as the URL and metadata to the crawl and property databases using the HTTP connector. This is the final stage of crawling.

Friday, September 13, 2013

CheckPoints - SharePoint Migration

I would like to list the areas where I have faced issues during DB migration; I strongly feel it will be of great assistance to check the following areas before starting any migration activity (a sketch of typical database-attach commands follows the checklist).
(Types of migration include version-to-version, or within the same version of SharePoint.)


  • Update Email Alerts
  • Incoming Email lists
  • Intranet/extranet URLs and broken links
  • Navigation
  • User permissions
  • Versioning issues
  • Space issues in all servers
  • Network speed to copy data between domains
  • Authentication
  • Solutions
  • Orphaned site collections/content databases
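
For the database-attach part of a migration, the following is a minimal sketch of the stsadm commands typically involved: enumerating content databases on the source farm, running the pre-upgrade checker, and attaching a database to the target web application. The URLs, database name, and server name are placeholders:

stsadm -o enumcontentdbs -url http://source-webapp
stsadm -o preupgradecheck
stsadm -o addcontentdb -url http://target-webapp -databasename WSS_Content_Portal -databaseserver SQLServerName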

Deleting Orphaned Site Collection

Here is a quick step for deleting an orphaned site collection as a final resolution when the usual methods have not worked (a sketch of those usual stsadm methods follows the query):
 
To delete the site collection, use the following SQL query, which should be executed against the configuration database:

 

DELETE FROM SiteMap
WHERE Id = 'a97953bd-6305-4eb0-872b-4fb4009373f6'    -- Id is the site ID of the orphaned site collection
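
Before touching the configuration database, the "usual methods" mentioned above are typically the stsadm commands sketched below; the web application URL, database name, and server name are placeholders, and the GUID is the same orphaned site ID used in the query:

stsadm -o databaserepair -url http://webapp -databasename WSS_Content_DB_Name
stsadm -o databaserepair -url http://webapp -databasename WSS_Content_DB_Name -deletecorruption
stsadm -o deletesite -force -siteid a97953bd-6305-4eb0-872b-4fb4009373f6 -databaseserver SQLServerName -databasename WSS_Content_DB_Name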

SharePoint 2013 - ADFS - Configuration

The main objective of this post is to provide detailed configuration steps on how to set up SAML Authentication for SharePoint 2013/2016 w...