Search Results

Search found 91480 results on 3660 pages for 'large data in sharepoint list'.


  • Managing SharePoint permissions via Active Directory?

    - by rgmatthes
    My company has thousands of employees organized thoroughly via Active Directory. I have confidence in the accuracy of the Department and Title information displayed in the user profiles. I'm helping to put up a brand new SharePoint 2007 site, and I contacted IT about managing the site's permissions through AD Groups. The goal is to have the site automatically assign read/write/contribute/whatever permissions based on the information in AD. For example, we could create an AD Group called "Managers" that would contain anyone with the "Manager" title in their AD user profile. I would have SharePoint tap into this AD Group to mass assign permissions if I knew all managers would need a certain level of access (read/write/contribute/whatever). Then if a manager joins the company or leaves it, the group is automatically updated (provided AD gets updated, of course).

    My IT rep called back and said it couldn't be done. This seems like a pretty straightforward business requirement, and one of the huge benefits of having Active Directory, but maybe I'm mistaken. Could anyone shed some light on this?

    A) Is it possible to use dynamically-updated AD Groups when assigning permissions via SharePoint? (Does anyone know of a guide I could show my doubtful IT rep?)

    B) Is there a "best practice" way to go about this? I've read some debate on whether SharePoint Groups or AD Groups are the way to go. My main concern is dynamic updating.

    C) If this isn't available out of the box, can someone recommend third-party software that will provide the functionality I'm looking for?

    A big thanks to anyone who can help me out!!
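
    For reference, SharePoint accepts an AD security group anywhere an individual user can be granted permissions, and AD evaluates the group's membership at sign-in, so the dynamic-update behavior described above works out of the box. A minimal sketch of granting a permission level to an AD group from the server object model (the URL, group name, and role name are invented placeholders):

        using System;
        using Microsoft.SharePoint;

        class GrantAdGroup
        {
            static void Main()
            {
                using (SPSite site = new SPSite("http://intranet"))
                using (SPWeb web = site.OpenWeb())
                {
                    // EnsureUser resolves AD security groups as well as individual users.
                    SPUser managers = web.EnsureUser(@"DOMAIN\Managers");

                    // Bind the group to an existing permission level, e.g. Contribute.
                    SPRoleAssignment assignment = new SPRoleAssignment(managers);
                    assignment.RoleDefinitionBindings.Add(web.RoleDefinitions["Contribute"]);
                    web.RoleAssignments.Add(assignment);
                }
            }
        }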

    Read the article

  • Data view web part throwing error

    - by Ashutosh Singh
    Hi, I'm using an XSLT-based data view web part. The steps I took to create it:

    1. Added a list view web part on the page.
    2. Modified the toolbar property to show the full toolbar.
    3. Opened the web page containing the above list view web part in SharePoint Designer and converted it to an XSLT-based web part (to make further changes to the UI).
    4. Saved the page and previewed it in the browser.

    In the browser the web part was throwing an error, while I was able to see it properly in Designer without any error. The error message shown in the web part was:

    Unable to display this Web Part. To troubleshoot the problem, open this Web page in a Windows SharePoint Services-compatible HTML editor such as Microsoft Office SharePoint Designer. If the problem persists, contact your Web server administrator.

    The error messages in the SharePoint log file were:

    05/12/2010 17:56:29.54 w3wp.exe (0x19FC) 0x1E9C Windows SharePoint Services Web Parts 89a1 Monitorable Error while executing web part: System.NullReferenceException: Object reference not set to an instance of an object. at Microsoft.SharePoint.WebPartPages.DataFormWebPart.ResolveParameterValuesToXsl(ArgumentClassWrapper argList) at Microsoft.SharePoint.WebPartPages.DataFormWebPart.PrepareAndPerformTransform()
    05/12/2010 17:56:29.62 w3wp.exe (0x19FC) 0x1E9C Windows SharePoint Services Web Controls 88wy Medium SPDataSourceView.ExecuteSelect() - selectArguments: IsEmpty=True, MaximumRows=0, RetrieveTotalRowCount=False, SortExpression=, StartRowIndex=0, TotalRowCount=-1
    05/12/2010 17:56:29.62 w3wp.exe (0x19FC) 0x1E9C Windows SharePoint Services Web Controls 88x2 Medium SPDataSourceView.ExecuteSelect() - formattedQuery = 1
    05/12/2010 17:56:29.64 w3wp.exe (0x19FC) 0x1E9C Windows SharePoint Services Web Parts 89a1 Monitorable Error while executing web part: System.NullReferenceException: Object reference not set to an instance of an object. at Microsoft.SharePoint.WebPartPages.DataFormWebPart.ResolveParameterValuesToXsl(ArgumentClassWrapper argList) at Microsoft.SharePoint.WebPartPages.DataFormWebPart.PrepareAndPerformTransform()

    Read the article

  • Deploying Data Mining Models using Model Export and Import

    - by [email protected]
    In this post, we'll take a look at how Oracle Data Mining facilitates model deployment. After building and testing models, a next step is often putting your data mining model into a production system -- referred to as model deployment. The ability to move data mining model(s) easily into a production system can greatly speed model deployment, and reduce the overall cost. Since Oracle Data Mining provides models as first class database objects, models can be manipulated using familiar database techniques and technology. For example, one or more models can be exported to a flat file, similar to a database table dump file (.dmp). This file can be moved to a different instance of Oracle Database EE, and then imported. All methods for exporting and importing models are based on Oracle Data Pump technology and found in the DBMS_DATA_MINING package.

    Before performing the actual export or import, a directory object must be created. A directory object is a logical name in the database for a physical directory on the host computer. Read/write access to a directory object is necessary to access the host computer file system from within Oracle Database. For our example, we'll work in the DMUSER schema. First, DMUSER requires the privilege to create any directory. This is often granted through the sysdba account.

        grant create any directory to dmuser;

    Now, DMUSER can create the directory object specifying the path where the exported model file (.dmp) should be placed. In this case, on a Linux machine, we have the directory /scratch/oracle.

        CREATE OR REPLACE DIRECTORY dmdir AS '/scratch/oracle';

    If you aren't sure of the exact name of the model or models to export, you can find the list of models using the following query:

        select model_name from user_mining_models;

    There are several options when exporting models. We can export a single model, multiple models, or all models in a schema using the following procedure calls:

        BEGIN
          DBMS_DATA_MINING.EXPORT_MODEL ('MY_MODEL.dmp', 'dmdir', 'name = ''MY_DT_MODEL''');
        END;

        BEGIN
          DBMS_DATA_MINING.EXPORT_MODEL ('MY_MODELS.dmp', 'dmdir',
                                         'name IN (''MY_DT_MODEL'', ''MY_KM_MODEL'')');
        END;

        BEGIN
          DBMS_DATA_MINING.EXPORT_MODEL ('ALL_DMUSER_MODELS.dmp', 'dmdir');
        END;

    A .dmp file can be imported into another schema or database using the following procedure call, for example:

        BEGIN
          DBMS_DATA_MINING.IMPORT_MODEL ('MY_MODELS.dmp', 'dmdir');
        END;

    As with models from any data mining tool, when moving a model from one environment to another, care needs to be taken to ensure the transformations that prepare the data for model building are matched (with appropriate parameters and statistics) in the system where the model is deployed. Oracle Data Mining provides automatic data preparation (ADP) and embedded data preparation (EDP) to reduce, or possibly eliminate, the need to explicitly transport transformations with the model. In the case of ADP, ODM automatically prepares the data and includes the necessary transformations in the model itself. In the case of EDP, users can associate their own transformations with attributes of a model. These transformations are automatically applied when applying the model to data, i.e., scoring. Exporting and importing a model with ADP or EDP results in these transformations being immediately available with the model in the production system.
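
    As a quick sanity check after IMPORT_MODEL - a small sketch, assuming the same target schema - the data dictionary view used above can also confirm the model arrived with its metadata intact:

        select model_name, mining_function, algorithm
          from user_mining_models;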

    Read the article

  • Hover effect for parent unordered list inherited by child list

    - by elvista
    I have a simple menu:

        <ul id="menu">
          <li class="leaf"><a href="#">Menu Item 1</a></li>
          <li class="leaf"><a href="#">Menu Item 2</a></li>
          <li class="expanded"><a href="#">Menu Item 3</a>
            <ul>
              <li class="leaf"><a href="#">Menu Item a</a></li>
              <li class="leaf"><a href="#">Menu Item b</a></li>
              <li class="leaf"><a href="#">Menu Item c</a></li>
            </ul>
          </li>
          <li class="leaf"><a href="#">Menu Item 4</a></li>
        </ul>

    and

        ul#menu li a:hover { font-weight: bold; }

    The problem I am facing is that when I hover over a ul li li, the parent as well as all its siblings get the hover effect. I only want the list item I hovered over to get the effect. I tried ul#menu li.leaf a:hover {..} and ul#menu li.expanded a:hover {..}, but even in that case, when I hover over li.expanded, its child inherits the style. How do I fix this?
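
    A minimal sketch of the usual fix: ul#menu li a is a descendant selector, so it matches anchors at every nesting level; scoping the top level with child combinators keeps the two levels apart (class names as in the markup above; whether this also cures the sibling behavior depends on the browser involved):

        /* top-level anchors only */
        ul#menu > li > a:hover { font-weight: bold; }

        /* nested anchors, styled independently */
        ul#menu li ul a:hover { text-decoration: underline; }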

    Read the article

  • SharePoint 2010 Hosting :: SharePoint 2010 Custom Web Template

    - by mbridge
    SharePoint 2010 offers some changes and additions to the SharePoint 2007 approach. Site definitions and publishing providers remain largely the same, but site templates created from the SharePoint UI or SharePoint Designer are now saved to a .WSP file, the same solution deployment packaging file format used for deploying custom SharePoint solutions. Site templates saved to a .WSP solution file can be imported into Visual Studio for additional customization.

    Introducing the WebTemplate Feature Element

    The WebTemplate element, introduced in SharePoint 2010, allows site templates to be defined and deployed as a Feature as part of a solution package. A WebTemplate element feature can be used to deploy site templates in either a Farm or Sandbox solution - without modification. If deployed as a Farm feature and solution, site templates will appear in the site collection provisioning page in Central Administration and can be used to provision new site collections, or within a site collection to create sub-sites. If deployed as a Site feature and Sandbox solution, site templates will appear within the site collection to support creating a root site or sub-sites.

    Creating a new WebTemplate Feature in Visual Studio 2010

    In addition to supporting the ability to save and import site templates created from the SharePoint UI into Visual Studio for customization, Visual Studio 2010 can also be used to create new site templates from scratch. In the following sample we will walk through how to create a new WebTemplate solution based on a customized version of the out-of-box Blank Site.

    1. Create a new Empty SharePoint Project in Visual Studio 2010.
    2. Add a new Empty Element to the project. We like to create folders for each type of element in our solution, so in our sample we have created a Web Templates folder and then added the BLANKENT element. NOTE: The element's folder MUST share the same name as the WebTemplate Name property.
    3. Open the empty Elements.xml and add the <WebTemplate /> element block (see the sketch after this walkthrough).
    4. Copy the default.aspx and ONET.XML files from the STS site definition location at 14\TEMPLATE\SiteTemplates\STS. We will customize the ONET.XML in the next section. Open the properties for each file and set the Deployment Type to ElementFile. This ensures the files are deployed with the element when included in a Feature.
    5. By default a new feature is added to the solution automatically when a new element is added. Rename and edit the feature as appropriate. Select Farm for the scope to deploy the WebTemplate to the entire farm, or Site for a sandboxed solution.

    Customize the ONET.XML

    At this point, you have a working WebTemplate solution that will deploy a site identical to the out-of-box Blank Site. However, the ONET.XML supporting the STS site definition contains 3 configurations - essentially 3 separate site templates - and can be simplified before customizing. In the following sample, we have trimmed the ONET.XML to the essentials for a single site template, and added references to the <SiteFeatures /> and <WebFeatures /> elements to include the SharePoint Standard and Enterprise features. We have left the top-level navigation bar and the default page module intact, but removed all other extraneous markup.
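
    For step 3, a minimal sketch of what the <WebTemplate /> element block can look like (the Title, Description, and DisplayCategory values are invented for this example; Name must match the element folder name, and the Base* attributes point at the STS Blank Site configuration):

        <Elements xmlns="http://schemas.microsoft.com/sharepoint/">
          <WebTemplate
              Name="BLANKENT"
              Title="Blank Site (Enterprise)"
              Description="Customized Blank Site deployed as a WebTemplate feature."
              BaseTemplateName="STS"
              BaseTemplateID="1"
              BaseConfigurationID="1"
              DisplayCategory="Custom" />
        </Elements>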

    Read the article

  • SharePoint 2010 deployment problem after added a new server to existing farm

    - by mrt
    I have a SharePoint 2010 farm with one server. I'm developing some features in a SharePoint farm solution (not a sandboxed solution, because of user rights problems). All feature scopes are set to "Site". I can deploy the solution to SharePoint with no problem.

    I added a new web front-end server to my existing farm. Now when I try to deploy my solution, VS2010 shows this error:

    Error occurred in deployment step 'Activate Features': Feature with Id 'xxx' is not installed in this farm, and cannot be added to this scope

    I log in with the AD administrator account to the development server. The administrator account is in the site collection admins on the target web application. The farm account is in the local administrators group. Is there a solution for this error?

    Read the article

  • WCF/ADO.NET Data Services - Could not load type 'System.Data.Services.Providers.IDataServiceUpdateProvider'

    - by Sahil Malik
    Ad:: SharePoint 2007 Training in .NET 3.5 technologies (more information).

    When you try accessing ListData.svc, do you get the following error?

    Could not load type 'System.Data.Services.Providers.IDataServiceUpdateProvider' from assembly 'System.Data.Services, Version=3.5.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'.

    Well, if you followed the instructions in Chapter 1 of my book to build your VM, you wouldn't run into the above issue. But if you do, you need to install:

    For Windows Vista and Windows 2008 - http://www.microsoft.com/downloads/details.aspx?familyid=4B710B89-8576-46CF-A4BF-331A9306D555&displaylang=en
    For Windows 7 and Windows 2008 R2 - http://www.microsoft.com/downloads/details.aspx?familyid=79d7f6f8-d6e9-4b8c-8640-17f89452148e&displaylang=en

    Remember to: a) Install the x64 version, and b) Do an IISReset before trying again.

    Comment on the article ....

    Read the article

  • Why Oracle Data Integrator for Big Data?

    - by Mala Narasimharajan
    Big Data is everywhere these days - but what exactly is it? It's data that comes from a multitude of sources - not only structured data, but unstructured data as well. The sheer volume of data is mind-boggling; a few examples of what constitutes big data: climate information collected from sensors, social media information, digital pictures, log files, online video files, medical records, and online transaction records.

    Embedded in big data is tremendous value, and being able to manipulate, load, transform and analyze big data is key to enhancing productivity and competitiveness. The value of big data lies in its propensity for greater in-depth analysis and data segmentation - in turn giving companies detailed information on product performance, customer preferences and inventory. Furthermore, by being able to store and create more data in digital form, "big data can unlock significant value by making information transparent and usable at much higher frequency." (McKinsey Global Institute, May 2011)

    Oracle's flagship product for bulk data movement and transformation, Oracle Data Integrator, is a critical component of Oracle's Big Data strategy. ODI provides automation, bulk loading, and validation and transformation capabilities for Big Data while minimizing the complexities of using Hadoop. Specifically, the advantages of ODI in a Big Data scenario are due to pre-built Knowledge Modules that drive processing in Hadoop, leveraging the graphical UI to load and unload data from Hadoop, perform data validations, and create mapping expressions for transformations. The Knowledge Modules provide a key jump-start and eliminate a significant amount of Hadoop development. Using Oracle Data Integrator together with Oracle Big Data Connectors, you can not only simplify the complexities of mapping, accessing, and loading big data (via NoSQL or HDFS), but also correlate it with your enterprise data - a correlation that may require integrating across heterogeneous and standards-based environments, connecting to Oracle Exadata, or sourcing via a big data platform such as Oracle Big Data Appliance.

    To learn more about Oracle Data Integration and Big Data, download our resource kit to see the latest in whitepapers, webinars, downloads, and more… or visit our website at www.oracle.com/bigdata

    Read the article

  • Why does everybody hate SharePoint?

    - by Ryan Michela
    Reading this topic about the most over-hyped technologies, I noticed that SharePoint is almost universally reviled. My experience with SharePoint (especially the most recent versions) is that it accomplishes its core competencies smartly. Namely:

    Centralized document repository - get all those Office documents out of email (with versioning)
    User-editable content creation for internal information dissemination - look, an HR site with current phone numbers and the vacation policy
    Project collaboration - a couple of clicks creates a site with a project's documents, task list, simple schedule, threaded discussion, and possibly a list of all project-related emails
    Very basic business automation - when you fill out the vacation form, an email is sent to HR

    My experience is that SharePoint only gets really ugly when an organization tries to push it in a direction it isn't designed for. SharePoint is not a CRM, ERP, bug database or external website. SharePoint is flexible enough to serve in a pinch, but it is no replacement for a dedicated tool. (Microsoft is just as guilty of pushing SharePoint into domains it doesn't belong in.) If you use SharePoint for what it's designed for, it really does work. Thoughts?

    Read the article

  • SharePoint: UI for ordering list items

    - by svdoever
    SharePoint list items have, in the base Item template, a field named Order. This field is not shown by default. SharePoint 2007, 2010 and 2013 provide a UI for specifying the order, via the _layouts page:

    {SiteUrl}/_layouts/Reorder.aspx?List={ListId}

    In SharePoint 2010 and 2013 it is possible to add a custom action to a list. You can add a custom action to order list items as follows (SharePoint 2010 description):

    1. Open SharePoint Designer 2010.
    2. Navigate to a list.
    3. Select Custom Actions > List Item Menu.
    4. Fill in the dialog box.
    5. Open List Settings > Advanced Settings > Content Types, and set Allow management of content types to No.
    6. On List Settings, select Column Ordering.

    This results in a custom Order Items action in the browser UI (under List Tools > Items), which opens the reorder page. You can change your custom action later in SharePoint Designer; on the list screen, you can find it in the bottom right corner. A declarative sketch of the same custom action follows below.

    We now need to modify the view to include the order, by adding the Order field to the view and sorting on the Order field. The problem is that the Order field is hidden. It is possible to modify the schema of the Order field to set the Hidden attribute to FALSE. If we don't want to write code to do this and still want the required result, it is also possible to modify the code of the view through SharePoint Designer. Note that if you change the view through the web UI, these changes are partly lost. If this is a problem, modify the Order field schema for the list.
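
    For those who prefer deploying the action declaratively instead of through SharePoint Designer, a hedged sketch of the equivalent CAML (the Id, Title, and Sequence values are invented for this example; the URL is the _layouts page quoted above):

        <Elements xmlns="http://schemas.microsoft.com/sharepoint/">
          <CustomAction
              Id="OrderListItems"
              Title="Order Items"
              RegistrationType="List"
              RegistrationId="100"
              Location="Microsoft.SharePoint.StandardMenu"
              GroupId="ActionsMenu"
              Sequence="1000">
            <UrlAction Url="{SiteUrl}/_layouts/Reorder.aspx?List={ListId}" />
          </CustomAction>
        </Elements>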

    Read the article

  • "There is no Web named" - SharePoint Event Handler

    - by Roosh Malai
    I activated the following code with a feature (web-level scope). Now when I add an item to any document library it should create a folder "". No folder is created, and no error is given either. Can anyone see what is going on? I got the following from the log file. I found similar code all over Google, so I am kind of puzzled why it is not working in my environment. Thanks.

        using System;
        using Microsoft.SharePoint;

        namespace AddaFolder
        {
            // The receiver class is made public so SharePoint can instantiate it.
            public class clAddaFolder : SPItemEventReceiver
            {
                public override void ItemAdded(SPItemEventProperties properties)
                {
                    base.ItemAdded(properties);

                    // SPContext is not reliable inside event receivers; the event
                    // properties already identify the web and list the event fired on.
                    using (SPWeb currentWeb = properties.OpenWeb())
                    {
                        SPList list = currentWeb.Lists[properties.ListId];

                        // Folders.Add takes the parent folder URL, the object type,
                        // and the leaf name of the new folder.
                        SPListItem folder = list.Folders.Add(
                            list.RootFolder.ServerRelativeUrl,
                            SPFileSystemObjectType.Folder,
                            "My Test Folder");
                        folder.Update();
                    }
                }
            }
        }

    A representative excerpt from the log:

    04/03/2010 17:52:44.25 w3wp.exe (0x00C0) 0x0C88 Windows SharePoint Services General 8kh7 High There is no Web named "/sites/myDevSiteColl/myDevWeb/Shared Documents/Forms/AllItems.aspx".
    04/03/2010 17:52:44.26 w3wp.exe (0x00C0) 0x0C88 Windows SharePoint Services General 8kh7 High There is no Web named "/sites/myDevSiteColl/myDevWeb/My TEST Doc Library/Forms/AllItems.aspx".
    04/03/2010 17:52:44.27 w3wp.exe (0x00C0) 0x0C88 Windows SharePoint Services General 8kh7 High There is no Web named "/sites/myDevSiteColl/myDevWeb/Lists/Calendar/calendar.aspx".
    04/03/2010 17:52:44.29 w3wp.exe (0x00C0) 0x0C88 Windows SharePoint Services General 8kh7 High There is no Web named "/sites/myDevSiteColl/myDevWeb/Lists/Tasks/AllItems.aspx".
    04/03/2010 17:52:44.30 w3wp.exe (0x00C0) 0x0C88 Windows SharePoint Services General 8kh7 High There is no Web named "/sites/myDevSiteColl/myDevWeb/Lists/Team Discussion/AllItems.aspx".

    (The same five entries repeat through 17:52:44.36, at 17:52:51.33-51.44, and at 17:53:02.69-02.79.)

    Read the article

  • General Policies and Procedures for Maintaining the Value of Data Assets

    Here is a general list of policies and procedures for maintaining the value of data assets.

    Data Backup Policies and Procedures

    Backups are very important when dealing with data because there is always the chance of losing data due to faulty hardware or user activity, so a strategic backup system should be mandatory for all companies. That being said, in the real world some companies that I have worked for do not really have a good data backup plan, and when companies take that kind of approach to backups, the data is typically not really recoverable. Unfortunately, when companies do not regularly test their backup plans they get a false sense of security, because they think they are covered. I can tell you from personal and professional experience that a backup plan/system is never fully implemented until it is regularly tested, well before the time it actually needs to be used.

    Disaster Recovery Plan

    Expanding on backup policies and procedures, a company also needs a disaster recovery plan in order to protect its data in case of a catastrophic disaster. Disaster recovery plans typically encompass how to restore all of a company's data and infrastructure back to an operational status, and most also include time estimates for how long each step of the plan should take to execute. It is important to note that disaster recovery plans, like backup plans, are never fully implemented until they have been tested. They should be tested regularly so that the business can be confident of losing no data, or minimal data, in a catastrophic disaster.

    Firewall Policies and Content Filters

    One way companies can protect their data is by using a firewall to separate their internal network from the outside. Firewalls allow or deny network access as data passes through them by applying various defined restrictions, and can likewise be used to prevent access from the internal network to the outside based on the same factors.

    Common firewall restrictions:
    - Destination/sender IP address
    - Destination/sender host names
    - Domain names
    - Network ports

    Companies may also want to restrict what their network users view on the internet through content filters. Content filters allow a company to track which web pages a person has accessed, and can restrict access based on rules set up in the filter. Such a device and/or software can block access to domains or specific URLs based on a few factors.

    Common content filter criteria:
    - Known malicious sites
    - Specific page content
    - Page content theme

    Anti-Virus/Malware Policies

    Fortunately, most companies utilize antivirus programs on all computers and servers, and for good reason: viruses have been known to corrupt, invalidate, destroy, and steal data. Anti-virus applications are a great way to prevent malicious applications from gaining access to a company's data. However, anti-virus programs must be constantly updated, because new viruses are always being created, and the anti-virus vendors need to distribute updates to their applications so that they can catch and remove them.

    Data Validation Policies and Procedures

    Data validation is very important to ensure that only accurate information is stored. The existence of invalid data can cause major problems when businesses attempt to use data for knowledge-based decisions and for performance reporting.

    Data Scrubbing Policies and Procedures

    Data scrubbing is valuable to companies in one of two ways. The first is cleaning data prior to analysis and report generation. The second is removing things like personally identifiable information from data prior to transmitting it between multiple environments, or before the information is sent to an external location. An example of this can be seen with medical records, where HIPAA rules prohibit the storage of specific personal and medical information. Additionally, I have professionally run into a scenario where the Canadian government does not allow any Canadian's personal information to be stored on a server not located in Canada.

    Encryption Practices

    The use of encryption is very valuable when a company needs to store any personal information. It allows users with the appropriate access levels to view or confirm the existence or accuracy of data within a system, either by decrypting the information or by encrypting a candidate value and comparing it to the stored version. Additionally, if for some unforeseen reason the data got into the wrong hands, they would first have to decrypt it before they could even read it. Encryption simply adds an additional layer of protection around the data itself. A small sketch of the compare-without-decrypting idea appears at the end of this post.

    Standard Normalization Practices

    The use of standard data normalization practices is very important when dealing with data because it can prevent a lot of potential issues by eliminating unnecessary data duplication. Issues caused by data duplication include excess use of data storage, an increased chance of invalid data, and overuse of data processing.

    Network and Database Security/Access Policies

    Every company has some form of network/data access policy, even if that form is having none at all. These policies help secure data from being seen by inappropriate users, and prevent data from being updated or deleted by users who should not have such rights. Without a good security policy there is a large potential for data to be corrupted by unwitting users, or even stolen.

    Data Storage Policies

    Data storage policies are very important, especially when implemented in conjunction with other policies like data backups. I have worked at companies where all network user folders are constantly backed up, so if a user wanted to ensure a file survived, they had to store that file in their network folder. Conversely, I have also worked in places where a user's entire profile is backed up whenever they log on or off the network.

    Training Policies

    One of the biggest ways to prevent data loss, and to ensure that data remains a company asset, is training. Properly training employees on how to work with the systems that access data is crucial, and it reduces the chance of users invalidating data while performing their tasks.
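
    As a toy illustration of the Encryption Practices section above - a minimal sketch, assuming a one-way SHA-256 digest is stored rather than a reversible cipher, with all names invented for the example:

        using System;
        using System.Security.Cryptography;
        using System.Text;

        class DigestCheck
        {
            // Returns true when the candidate value hashes to the stored digest,
            // confirming a match without ever decrypting anything.
            static bool MatchesStoredDigest(string candidate, string storedDigestBase64)
            {
                using (SHA256 sha = SHA256.Create())
                {
                    byte[] computed = sha.ComputeHash(Encoding.UTF8.GetBytes(candidate));
                    return Convert.ToBase64String(computed) == storedDigestBase64;
                }
            }
        }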

    Read the article

  • SSRS/SharePoint - Reports made in Report Builder not being listed in SharePoint Web Part

    - by Greg_the_Ant
    I followed the steps here to integrate Reporting Services with SharePoint in native mode. I made a page in SharePoint with the Report Explorer web part and everything is working. The issue is that when I create a report with the web-based Report Builder tool, it shows up in the Report Manager page but does not show up in the Report Explorer web part on the SharePoint page. New reports I upload using Report Manager do show up. Does anyone have any ideas? I'm really stuck.

    Read the article

  • Independent SharePoint Trainer in DC ~ I conduct teacher-led SharePoint user training anywhere ~

    - by technical-trainer-pro
    Your options: interactive, hands-on virtual or classroom-style training for all SharePoint users and site admin owners. I also develop customized classes tailored to the specific design of any SharePoint site - acting as the translator for those left to understand and use it on an everyday basis.

    Audience: users, clients, stakeholders, trainers
    Areas: functionality, operations, management, user site customization, ITIL training, governance process, change management, and industry- or client-specific scenarios

    INDIVIDUAL RATE - $300 to join any class
    GROUP RATE - $1500 for a private group of (6-10)
    Flexible scheduling. Contact me: [email protected]
    Local to DC/MD/VA - can train hands-on anywhere~

    Read the article

  • Enhancing, Employing, Engaging SharePoint 2010

    - by Sahil Malik
    SharePoint 2010 Training: more information

    Recently I completed a recording for the Microsoft Partner Learning Center (PLC) on three topics. The videos are now available for your viewing pleasure; I hope you find them useful.

    Enhancing SharePoint 2010 – The development story: the various ways to deliver functionality in SharePoint, with most of the focus on Visual Studio 2010, and SharePoint Designer 2010 where necessary. Watch Video (https://training.partner.microsoft.com/learning/app/management/LMS_ActDetails.aspx?UserMode=0&ActivityId=732988)

    Read full article ....

    Read the article

  • Using a site page to find out the web template name used in a SharePoint site

    - by ybbest
    Today I created a SharePoint solution. It deploys a site page with code-behind that shows the web template name used in a SharePoint site. You can download the project from here. After you have deployed the project, you can see your template name at http://[site collection Name]/sitepage/WebTemplateInfo.aspx. A sketch of the kind of code-behind involved follows below.

    References:
    http://blogs.msdn.com/b/kaevans/archive/2010/06/28/creating-a-sharepoint-site-page-with-code-behind-using-visual-studio-2010.aspx
    http://www.devexpertise.com/2009/02/06/sharepoint-list-template-ids-and-site-template-ids/
    http://blog.rafelo.com/2008/05/determining-site-template-used-on.html
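
    The heart of such a page is small - a hedged sketch of the code-behind involved, not the downloadable project itself (the page base class and the templateLabel control name are assumptions for this example):

        using System;
        using System.Web.UI.WebControls;
        using Microsoft.SharePoint;
        using Microsoft.SharePoint.WebPartPages;

        // A site page with code-behind; templateLabel is an <asp:Label>
        // declared in the .aspx markup.
        public class WebTemplateInfoPage : WebPartPage
        {
            protected Label templateLabel;

            protected override void OnLoad(EventArgs e)
            {
                base.OnLoad(e);
                SPWeb web = SPContext.Current.Web;
                // WebTemplate holds the template name (e.g. "STS") and
                // Configuration its ID - together the familiar "STS#0" form.
                templateLabel.Text = web.WebTemplate + "#" + web.Configuration;
            }
        }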

    Read the article

  • Macaw DualLayout for SharePoint 2010 WCM released!

    - by svdoever
    A few months ago I wrote a blog post about the DualLayout component we developed for SharePoint Server 2010 WCM. DualLayout enables advanced web design on SharePoint WCM sites. See the blog post DualLayout - Complete HTML freedom in SharePoint Publishing sites! for background information. DualLayout is now available for download. Check out DualLayout for SharePoint 2010 WCM and download your fully functional trial copy! Enjoy the freedom!

    Read the article

  • A great overview of the features of the different SharePoint 2010 editions

    - by svdoever
    The following document gives a good overview of the features available in the different SharePoint editions: Foundation (free), Standard, and Enterprise. http://sharepoint.microsoft.com/en-us/buy/pages/editions-comparison.aspx It is good to see the power that is available in the free SharePoint Foundation edition - there is no reason not to use SharePoint as a foundation for your collaboration applications.

    Read the article

  • Importing Multiple Schemas to a Model in Oracle SQL Developer Data Modeler

    - by thatjeffsmith
    Your physical data model might stretch across multiple Oracle schemas. Or maybe you just want a single diagram containing tables, views, etc. spanning more than a single user in the database. The process for importing a data dictionary is the same regardless of whether you want to pull in objects from one schema or from many. Let's take a quick look at how to get started with a data dictionary import.

    I'm using Oracle SQL Developer in this example. The process is nearly identical in Oracle SQL Developer Data Modeler - the only difference being that you'll use the 'File' menu to get started versus the 'File - Data Modeler' menu in SQL Developer. Remember, the functionality is exactly the same whether you use SQL Developer or SQL Developer Data Modeler when it comes to the data modeling features - you'll just have a cleaner user interface in SQL Developer Data Modeler.

    Importing a Data Dictionary to a Model

    You'll want to open or create your model first. You can import objects to an existing or new model. The easiest way to get started is to simply open the 'Browser' under the View menu; the Browser allows you to navigate your open designs/models. You'll see an 'Untitled_1' model by default - I've renamed mine to 'hr_sh_scott_demo.' Now go back to the File menu, expand the 'Data Modeler' section, and select 'Import - Data Dictionary.' This is a fancy way of saying, 'suck objects out of the database into my model.'

    Connect!

    If you haven't already defined a connection to the database you want to reverse engineer, you'll need to do that now. I'm going to assume you already have that connection - so select it, and hit the 'Next' button.

    Select the Schema(s) to be Imported

    The schemas selected on this page of the wizard dictate the lists of tables, views, synonyms, and everything else you can choose from in the next wizard step to import. For brevity, I have selected ALL tables, views, and synonyms from 3 different schemas: HR, SCOTT, and SH. Once I hit the 'Finish' button in the wizard, SQL Developer will interrogate the database and add the objects to our model.

    The Big Model and the 3 Little Models

    I can now see ALL of the objects I just imported in the 'hr_sh_scott_demo' relational model, in my design tree and in my relational diagram. Quick tip: Oracle SQL Developer calls what most folks think of as a 'Physical Model' the 'Relational Model.' Same difference, mostly. In SQL Developer, a physical model allows you to define partitioning schemes and advanced storage parameters, and to add your PL/SQL code. You can have multiple physical models per relational model. For example, I might have a 4-node RAC in production that uses partitioning, but in test/dev only a single instance with no partitioning; I can have models for both of those physical implementations.

    Wouldn't it be nice if I could segregate the objects based on their schema? Good news, you can! And it's done by default. Several of you might already know where I'm going with this - SUBVIEWS. You can easily create a 'SubView' by selecting one or more objects in your model or diagram and adding them to a new SubView. SubViews are just mini-models: they contain a subset of objects from the main model. This is very handy when you want to break your model into smaller, more digestible parts.

    The model information is identical across the model and subviews, so you don't have to worry about making a change in one place and not having it propagate across your design. SubViews can be used as filters when you create reports and exports as well. So instead of generating a PDF for everything, just show me what's in my 'ABC' subview. But, I don't want to do any work! Remember, I'm really lazy. More good news - it's already done by default! The schemas are automatically used to create default SubViews.

    Auto-Navigate to the Object in the Diagram

    In the subview tree node, right-click on the object you want to navigate to. You can ask to be taken to the main model view or to the SubView location. If you haven't already opened the SubView in the diagram, it will be automatically opened for you. The SubView diagram only contains the objects from that SubView. Your SubView might still be pretty big - many dozens of objects - so don't forget about the 'Navigator' either!

    In summary, use the 'Import' feature to add existing database objects to your model. If you import from multiple schemas, take advantage of the default schema-based SubViews to help you manage your models! Sometimes less is more!

    Read the article

  • SharePoint 2010 Hosting - ASPHostPortal :: Installing SSRS 2008 R2 on SharePoint 2010

    - by mbridge
    What do you need first? Please download the SQL Server 2008 R2 November CTP Reporting Services Add-in for Microsoft SharePoint Technologies 2010, and follow these steps:

    1. Install a SharePoint technology instance. (Already done when installing PowerPivot with SharePoint.)
    2. Install SQL Server 2008 R2 November CTP Reporting Services and specify that the report server use SharePoint Integrated mode.
    3. Configure Reporting Services.
    4. Download the Reporting Services Add-in by clicking the rsSharePoint.msi link later on this page. To start the installation immediately, click Run.

    After installing Reporting Services and the add-in, your report server is ready to be integrated with SharePoint; in SharePoint 2010 we have some new admin screens. To integrate, go to Central Administration > General Application Settings. When you have successfully installed the add-in, a Reporting Services icon will be there. Click Reporting Services Integration and add the report server web service URL. (To get the URL, open the Reporting Services Configuration tool, connect to the report server, and click Web Service URL. Click the URL to verify it works, then copy it and paste it into Report Server Web Service URL.) Select your authentication mode (Windows authentication is preferred), add the user name and password of your admin account, and click OK to configure and start the integration. After the installation you can set the Reporting Services defaults.

    What has changed in SharePoint 2010 is that there isn't a report library available; you have to add content types to a default library. Go to a site collection, Site Actions, View All Site Content, and create an Asset library. Now we have to make sure we can add reports to the library by adding content types: open the library, click Library Tools > Library Settings, and under Content Types click Add from existing site content types. In the Select Content Types section, in Select site content types from, click the arrow to select Reporting Services. In the Available Site Content Types list, click Report Builder, Report Data Source and Report, and then click Add to move the selected content types to the Content types to add list. Now we are ready to upload reports and execute them from within our web parts.

    Another interesting post: Integrating SharePoint 2010 and SQL 2008 R2

    Read the article

  • Large files in SharePoint 2010

    - by Sahil Malik
    SharePoint 2010 Training: more information Hoooorayy! My latest code-mag article is finally online. This is an article I’ve been wanting to write for a while now – there is just so much goo in the world around large file management in SharePoint. So I thought an article that sums up the things you need to consider for large file management projects in SharePoint was in order. Anyway, here is the article, enjoy Read full article ....

    Read the article

  • New Article: The 12-Step Recovery Program from a SharePoint Error

    - by Sahil Malik
    SharePoint 2010 Training: more information Nice!! I had been waiting for this article to come online. In this article, I describe 12 steps that will let you sort out pretty much any SharePoint error there is. Here is a starting excerpt -- Hello, my name is Sahil, and I am a worsening SharePointoholic. SharePoint is built on ASP.NET 2.0 - pretty much like human beings are made up of carbon and water. There is a lot in SharePoint that isn't in ASP.NET. Not only is SharePoint a complex ASP.NET 2.0 application, it also has numerous concepts for things such as profiles, role providers, authorization, etc., that are different from ASP.NET…… Read the rest … Read full article ....

    Read the article

  • Exposing an MVC Application Through SharePoint

    - by Damon
    Below you will find my presentation slides and demo files for my SharePoint TechFest 2010 presentation on Exposing an MVC Application through SharePoint. One of the points I forgot to mention goes back to the performance and licensing benefits of this approach. If you have a SharePoint box that is completely slammed, you can put the MVC application on a separate web server and essentially offload the application processing to another server. In terms of licensing, you can leave SharePoint off that new server and just access SharePoint data via web services from the box. This makes it a lot cheaper if you have MOSS - but if you're just running WSS then it may not have as many cost benefits. Remember, programming against the web services is not always the easiest thing, so you have to weigh the cost/benefit ratio when making such a determination. A hedged sketch of that web-service access follows below.
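
    A sketch of calling the out-of-box Lists.asmx service from the non-SharePoint server. The 'ListsProxy' namespace is whatever name you give the generated web reference, and the server URL and list name are invented for the example:

        using System;
        using System.Net;
        using System.Xml;

        class Program
        {
            static void Main()
            {
                // Proxy generated from http://sharepoint/_vti_bin/Lists.asmx?WSDL;
                // the namespace/class name depends on how the reference was added.
                var lists = new ListsProxy.Lists();
                lists.Url = "http://sharepoint/_vti_bin/Lists.asmx";
                lists.Credentials = CredentialCache.DefaultCredentials;

                XmlDocument doc = new XmlDocument();
                XmlNode query = doc.CreateElement("Query");           // empty = all items
                XmlNode viewFields = doc.CreateElement("ViewFields");
                XmlNode queryOptions = doc.CreateElement("QueryOptions");

                // GetListItems returns the rows as raw XML to parse on the caller side.
                XmlNode items = lists.GetListItems(
                    "Shared Documents", null, query, viewFields, "100", queryOptions, null);
                Console.WriteLine(items.OuterXml);
            }
        }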

    Read the article

  • Deploying Data Mining Models using Model Export and Import, Part 2

    - by [email protected]
    In my last post, Deploying Data Mining Models using Model Export and Import, we explored using DBMS_DATA_MINING.EXPORT_MODEL and DBMS_DATA_MINING.IMPORT_MODEL to enable moving a model from one system to another. In this post, we'll look at two distributed scenarios that make use of this capability, and a tip for easily moving models from one machine to another using only Oracle Database - not an external file transport mechanism such as FTP.

    In the first scenario, consider a company with geographically distributed business units, each collecting and managing their data locally for the products they sell. Each business unit has in-house data analysts that build models to predict which products to recommend to customers in their space. A central telemarketing business unit also uses these models to score new customers locally, using data collected over the phone. Since the models recommend different products, each customer is scored using each model. This is depicted in Figure 1.

    Figure 1: Target instance importing multiple remote models for local scoring

    In the second scenario, consider multiple hospitals that collect data on patients with certain types of cancer. The data collection is standardized, so each hospital collects the same patient demographic and other health/tumor data, along with the clinical diagnosis. Instead of each hospital building its own models, the data is pooled at a central data analysis lab where a predictive model is built. Once completed, the model is distributed to hospitals, clinics, and doctor offices, which can score patient data locally.

    Figure 2: Multiple target instances importing the same model from a source instance for local scoring

    Since this blog focuses on model export and import, we'll only discuss what is necessary to move a model from one database to another. Here, we use the package DBMS_FILE_TRANSFER, which can move files between Oracle databases. The script is fairly straightforward, but requires setting up a database link and directory objects. We saw how to create directory objects in the previous post. To create a database link to the source database from the target, we can use, for example:

        create database link SOURCE1_LINK connect to <schema> identified by <password> using 'SOURCE1';

    Note that 'SOURCE1' refers to the service name of the remote database entry in your tnsnames.ora file. From SQL*Plus, first connect to the remote database and export the model. Note that the model_file_name does not include the .dmp extension; this is because EXPORT_MODEL appends "01" to the name. Next, connect to the local database, invoke DBMS_FILE_TRANSFER.GET_FILE, and import the model. Note that "01" is eliminated in the target system file name.

        connect <source_schema>/<password>@SOURCE1_LINK;
        BEGIN
          DBMS_DATA_MINING.EXPORT_MODEL ('EXPORT_FILE_NAME' || '.dmp',
                                         'MY_SOURCE_DIR_OBJECT',
                                         'name = ''MY_MINING_MODEL''');
        END;

        connect <target_schema>/<password>;
        BEGIN
          DBMS_FILE_TRANSFER.GET_FILE ('MY_SOURCE_DIR_OBJECT',
                                       'EXPORT_FILE_NAME' || '01.dmp',
                                       'SOURCE1_LINK',
                                       'MY_TARGET_DIR_OBJECT',
                                       'EXPORT_FILE_NAME' || '.dmp');
          DBMS_DATA_MINING.IMPORT_MODEL ('EXPORT_FILE_NAME' || '.dmp',
                                         'MY_TARGET_DIR_OBJECT');
        END;

    To clean up afterward, you may want to drop the exported .dmp file at the source and the transferred file at the target. For example:

        utl_file.fremove('&directory_name', '&model_file_name' || '.dmp');

    Read the article
