SiteMap, the website map, is very useful when building a website. It can be bound directly to the Menu and TreeView controls, and there is a SiteMapPath control that indicates the current path and can also be bound directly.
This is its typical XML definition:
<siteMapNode url="Course/Group/GroupList.aspx" title="GroupAdmin" >
The SiteMap's permissions are integrated with Membership, so users with different permissions see different maps. The role attribute can be configured to grant extra access as an exception; note that it widens access rather than restricting it.
<siteMapNode url="Course/Tests/TestList.aspx" title="TestAdmin" role="student">Here are some introductions :
Here is an introduction: http://zmsx.cnblogs.com/archive/2006/01/03/310381.aspx
Basic usage will not be described in detail here; instead, we will discuss how to extend SiteMap so that it can reach resources that take parameters.
First, let me introduce a resource, MySiteMapTool: http://quitgame.cnblogs.com/archive/2005/11/24/283910.aspx
The author of that post provides a tool that can forward requests with parameters in code, for example: MySiteMap.Forward("Details", "AlbumID={0}&Page={1}", 1, 4);
It is indeed simple and practical.
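The full implementation is in the linked post; roughly, it keeps a Title-to-URL table built from the site map and redirects with a formatted query string. A minimal sketch of that idea (the member names below are assumptions, not the original code):

using System;
using System.Collections.Specialized;
using System.Web;

public static class MySiteMap
{
    // Title -> static URL table, filled from the site map definition.
    private static readonly NameValueCollection siteMapCol = new NameValueCollection();

    // Look up the static URL registered for a node title.
    public static string FindForward(string title)
    {
        return siteMapCol[title];
    }

    // Redirect to that URL with a formatted query string appended, e.g.
    // MySiteMap.Forward("Details", "AlbumID={0}&Page={1}", 1, 4);
    public static void Forward(string title, string queryFormat, params object[] args)
    {
        string url = FindForward(title) + "?" + String.Format(queryFormat, args);
        HttpContext.Current.Response.Redirect(url);
    }
}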
The behaviour we want now is this: each page requires different parameters, and when those parameters are missing from the current request, the user should not reach that page but should be sent to its parent page instead, recursively.
First of all, SiteMap itself exposes a SiteMapResolve event, which is raised when the current path is resolved. Here is a piece of code from MSDN:
private void Page_Load(object sender, EventArgs e)
{
    // The ExpandForumPaths method is called to handle
    // the SiteMapResolve event.
    SiteMap.SiteMapResolve +=
        new SiteMapResolveEventHandler(this.ExpandForumPaths);
}

private SiteMapNode ExpandForumPaths(Object sender, SiteMapResolveEventArgs e)
{
    // The current node represents a Post page in a bulletin board forum.
    // Clone the current node and all of its relevant parents. This
    // returns a site map node that a developer can then
    // walk, modifying each node.Url property in turn.
    // Since the cloned nodes are separate from the underlying
    // site navigation structure, the fixups that are made do not
    // affect the overall site navigation structure.
    SiteMapNode currentNode = SiteMap.CurrentNode.Clone(true);
    SiteMapNode tempNode = currentNode;

    // Obtain the recent IDs.
    int forumGroupID = GetMostRecentForumGroupID();
    int forumID = GetMostRecentForumID(forumGroupID);
    int postID = GetMostRecentPostID(forumID);

    // The current node, and its parents, can be modified to include
    // dynamic querystring information relevant to the currently
    // executing request.
    if (0 != postID)
    {
        tempNode.Url = tempNode.Url + "?PostID=" + postID.ToString();
    }

    if ((null != (tempNode = tempNode.ParentNode)) &&
        (0 != forumID))
    {
        tempNode.Url = tempNode.Url + "?ForumID=" + forumID.ToString();
    }

    if ((null != (tempNode = tempNode.ParentNode)) &&
        (0 != forumGroupID))
    {
        tempNode.Url = tempNode.Url + "?ForumGroupID=" + forumGroupID.ToString();
    }

    return currentNode;
}
This code only attaches parameters to the nodes on the current path.
I tried a similar approach, but while it fixed SiteMapPath, the Menu control could no longer be bound to the data, and only part of the data was processed.
Later, building on the SiteMapTool class, I wrote a few functions to solve this problem. Below is the modified sitemap file, with a new configuration item added: rule. Its value lists the parameters required by that page; if the current request does not carry these parameters, the user is not allowed to access the page.
<siteMapNode url="Course/Group/GroupDetail.aspx" title="Group Detail" rule="cid;gid">
These are the two functions that process all paths recursively:

private string MakeURL(SiteMapNode node)
{
    // Walked past the root without finding a usable URL.
    if (node == null)
        return "";
    node.ReadOnly = false;
    // Find the static URL registered for this node.
    string url = MySiteMap.FindForward(node.Title);
    if (node["rule"] != null && node["rule"].Length > 0)
    {
        // The node has a rule; its format is like rule="cid;tid".
        string[] paramSet = node["rule"].Split(';');
        // Check that the current request carries every required parameter.
        for (int i = 0; i < paramSet.Length; i++)
        {
            // If a parameter is missing, fall back to the parent node instead.
            if (HttpContext.Current.Request.Params[paramSet[i]] == null)
                return MakeURL(node.ParentNode);
        }
        // All parameters are present: append them and return the URL.
        url += "?";
        for (int i = 0; i < paramSet.Length; i++)
        {
            string key = paramSet[i];
            // 'cid' ---> 'cid=1'
            url = url + key + "=" + HttpContext.Current.Request.Params[key] + "&";
        }
        return url.Substring(0, url.Length - 1); // remove the trailing '&'
    }
    else
    {
        // No rule: return the URL directly.
        return url;
    }
}

private void ReBindData(SiteMapNode root)
{
    string url = MakeURL(root);
    if (url != "")
        root.Url = url;
    for (int i = 0; i < root.ChildNodes.Count; i++)
    {
        ReBindData(root.ChildNodes[i]);
    }
}

ReBindData walks the tree and calls MakeURL for every node.
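The post does not show where these functions are called from; one way to wire them up, assuming the page has a Menu (and/or SiteMapPath) bound to a SiteMapDataSource, is to rewrite the URLs before the navigation controls data-bind:

protected void Page_Load(object sender, EventArgs e)
{
    // Rewrite every node's URL before the navigation controls bind to the site map.
    ReBindData(SiteMap.RootNode);
}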
The MySiteMap.FindForward function called in MakeURL comes from the implementation at http://quitgame.cnblogs.com/archive/2005/11/24/283910.aspx .
However, some changes are needed when applying it: the original implementation uses a static class that loads the data like this:
//SiteMapNodeCollection smc = SiteMap.RootNode.GetAllNodes();
//siteMapCol = new NameValueCollection();
//IEnumerator ie = smc.GetEnumerator();
//while (ie.MoveNext())
//{
//    siteMapCol[((SiteMapNode)ie.Current).Title] = ((SiteMapNode)ie.Current).Url;
//}
However, because an anonymous user is restricted by permissions, the pages it can reach are limited, so SiteMap.RootNode.GetAllNodes() does not return all the nodes; it may return only a part of them, or none at all.
The fix is to write a function yourself that reads the .sitemap XML file directly and collects all the node definitions recursively.
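A minimal sketch of such a loader, assuming the default Web.sitemap file name and the Title-to-URL collection used above:

// Required namespaces: System.Collections.Specialized, System.Web, System.Xml.
private static NameValueCollection LoadSiteMapFile()
{
    NameValueCollection col = new NameValueCollection();
    XmlDocument doc = new XmlDocument();
    doc.Load(HttpContext.Current.Server.MapPath("~/Web.sitemap"));
    CollectNodes(doc.DocumentElement, col);
    return col;
}

private static void CollectNodes(XmlNode node, NameValueCollection col)
{
    XmlElement element = node as XmlElement;
    if (element != null && element.Name == "siteMapNode"
        && element.HasAttribute("title") && element.HasAttribute("url"))
    {
        // Record every node definition, regardless of the current user's roles.
        col[element.GetAttribute("title")] = element.GetAttribute("url");
    }
    // Recurse into the children so the whole tree is collected.
    foreach (XmlNode child in node.ChildNodes)
    {
        CollectNodes(child, col);
    }
}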