Search Results

Search found 36081 results on 1444 pages for 'site studio'.


  • What is a proper way to store site-level global variables in a SharePoint site?

    - by ccomet
    One thing that has driven me nuts about SharePoint 2007 is the apparent inability to have definable settings that apply specifically to a site or site collection itself, and not the content. I mean, you have some pre-defined settings like the Site Logo, the Site Name, and various other things, but there doesn't appear to be anywhere to add new kinds of settings. The application I am working on needs to be able to create multiple kinds of "project site collections" that all follow a basic template but have certain additional settings that apply specifically to that site collection and that one alone. In addition to the standard site name, we also need to define the Project Number, the Project Name, and the Client Name. And given the requests of some of our clients, we also reach a point where we have to have configurable settings that alter how some of the workflows work, like whether files are marked with letters or numbers. Our current solution, which I'm hesitant about, has been to store an XML file on the SharePoint server. This file contains one node for each site collection, identified by the URL of the root site. Inside the node are all of the elements that need to be defined for that site collection. When we need them, we have to access the XML file (which always requires SPSecurity.RunWithElevatedPrivileges to access files directly on the server) every time to load it and retrieve the data. There are a lot of automated processes which will have to do this, and I'm hesitant about the stability of this method when we reach hundreds of sites with thousands of files running tens of thousands of workflows, all wanting to access this file. Maybe these are unfounded worries, but I'd rather worry than risk everything breaking in a couple of years. I was looking into the SPWeb object and found the AllProperties hashtable. It looks like just the kind of thing that might work, but I don't know how safe it is to modify it. I read through both MSDN and the WSS SDK but found nothing that clarified adding completely new properties into AllProperties. Is it safe to use AllProperties for this kind of thing? Or is there another feature that I am missing which could handle the concept of global variables at the site collection or site scope?
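    For what it's worth, here is a minimal sketch of the AllProperties approach (the property names and URL are made up for illustration; changes to AllProperties are persisted when SPWeb.Update() is called):

        using System;
        using Microsoft.SharePoint;

        class SiteSettingsSketch
        {
            static void Main()
            {
                using (SPSite site = new SPSite("http://server/sites/project-a")) // hypothetical URL
                using (SPWeb web = site.RootWeb)
                {
                    // Write site-collection-scoped settings on the root web.
                    web.AllProperties["ProjectNumber"] = "P-1234"; // made-up key and value
                    web.AllProperties["ClientName"] = "Acme Corp";
                    web.Update(); // persists the AllProperties changes

                    // Read them back later.
                    string projectNumber = web.AllProperties["ProjectNumber"] as string;
                    Console.WriteLine(projectNumber);
                }
            }
        }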

    Read the article

  • Extend legacy site with another server-side programming platform best practice

    - by Andrew Florko
    The company I work for has a site developed 6-8 years ago by a team that was enthusiastic enough to use their own private PHP-based CMS. I have to put dynamic data from one intranet company database onto this site in one week: 2-3 pages. I contacted the company's site administrator and she showed me the administrative part: the CMS only allows inserting HTML blocks and managing the site map (the site is deployed on a machine inside the company and is fully accessible and upgradeable). I'm not a PHP guy and I don't want to dive into a legacy CMS engine hardly anyone has ever heard of. I also don't want to contact the development team, because I'm not sure they are still around and capable of extending this old site, and it would take too much time anyway. I am about to deploy a helper ASP.NET site on IIS with the 2-3 required pages and reference the helper site via an iframe from the present site. The new pages will also allow downloading some dynamic content from the present site. Is this OK, and what are the pitfalls of the iframe approach?

    Read the article

  • Does sitewide HTML refactoring affect Google traffic?

    - by Name
    Good morning. I recently made a big structural change on my site, and the very next day the number of Google impressions went from 75,000 to 3,000, with a proportional drop in traffic from searches. No URLs were changed, and neither were the page titles or descriptions. Everything is exactly the same, just different looking, except that the site barely appears on Google anymore. Does anybody have a clue why?

    Read the article

  • Abnormal Alexa ranking score

    - by SteenhouwerD
    I have 2 e-commerce websites in France, and for these websites I see strange behaviour in the Alexa results. Here are some statistics about them:
    - Unique visits, January 2012: Website A: 158,828 / Website B: 58,867
    - Number of Google search results: Website A: 5,100 / Website B: 56,000
    - Links to my site: Website A: 3,120 / Website B: 2,180
    - Alexa score: Website A: 405,804 / Website B: 278,944
    How is it that website B, with 1/3 of the visitors of website A, has a much better Alexa score (x2) than website A?

    Read the article

  • Only one SharePoint site is not reachable

    - by Kabir Rao
    We have been facing a strange issue since this morning: only one SharePoint site is not accessible. On our SharePoint 2003 server there are several sites configured for separate projects. Several IIS resets have been done; every other site is accessible except this one. When I try to map the site as a network drive, I am able to access the contents of the site. Please help ASAP. Thanks up front.

    Read the article

  • How can I set up a dual-site Storage Daemon in Bacula (mirror the backup)

    - by Andy
    On site A, I have successfully set up a Bacula Director on one host, several File Daemons on the hosts I want to back up, and finally one Storage Daemon where the backup is actually stored. If disaster struck the building at Site A, I want a second Storage Daemon on another site, Site B. The FileSets, Director, etc. would be the same, except the jobs would be stored on the other Storage Daemon as well. Are there any best practices for this?
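    For reference, a sketch of what the Director-side additions might look like, assuming file-based storage and hypothetical names and addresses: a second Storage resource for Site B, a pool bound to it, and a Copy job that clones finished backups across (the Client/FileSet/Schedule values below are placeholders, reused from whatever the existing jobs define).

        # Hypothetical additions to bacula-dir.conf
        Storage {
          Name = siteb-sd
          Address = sd.siteb.example.com     # assumed hostname of the Site B SD
          SDPort = 9103
          Password = "secret"                # must match the SD's Director resource
          Device = FileStorage
          Media Type = File
        }

        Pool {
          Name = SiteB-Pool
          Pool Type = Backup
          Storage = siteb-sd
        }

        # Copy job: duplicates jobs from Site A's pool that have not been copied yet.
        # Site A's pool needs "Next Pool = SiteB-Pool" so the copies land at Site B.
        Job {
          Name = "CopyToSiteB"
          Type = Copy
          Level = Full
          Client = sitea-fd
          FileSet = "Full Set"
          Messages = Standard
          Pool = SiteA-Pool
          Selection Type = PoolUncopiedJobs
          Schedule = "WeeklyCycle"
        }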

    Read the article

  • Copy CakePHP site under IIS webroot?

    - by dan giz
    I've got a CakePHP application that uses an MS SQL server, running on Windows Server 2008 R2 Enterprise with IIS 7.5. My application is the only website running and is installed in wwwroot, with CakePHP installed to c:\inetpub (one level up from the site). I want to copy this site so I can have a development version to work on without conflicting with the live site. How do I go about doing this? I'm confused, since previous setups I've seen have a different folder in wwwroot for the site itself.

    Read the article

  • Why is my site not on Google? [closed]

    - by RD
    I wanted to post a link here, but some people might see that as advertising. So instead I'm going to phrase my question like this: what can I do to make sure my site appears on Google? I have already done the following:
    - Submitted my sitemap
    - Added my site at www.google.com/addurl
    - Added Analytics to my site
    - Checked Webmaster Tools for crawler errors
    But still, after about three or four days, the crawler hasn't crawled my site. What am I missing?

    Read the article

  • Using local DNS and public DNS during site development

    - by ChrisFM
    I'm a web designer and I often develop new sites for existing businesses. Sometimes I find it useful to point my DNS address (for my personal computer) to the development server's local DNS (instead of Google's 8.8.8.8 or the default ISP's address). I like this as it lets me see that the new site's internal links, etc., work before switching the authoritative DNS over from the old site. However, outgoing links (like a Google map) would not resolve with my local DNS. The first thing I thought was: oh, I just need to fill out my DNS directory with Google and every other domain I might need to link to... wait, that sounds insane. I was wondering if anyone could give me insight into a better or more functional way to use local DNS addresses when they're available and publicly supplied DNS addresses when they are not? Kind of like a DNS 'roll-over'? Or maybe a completely different approach to development altogether. Thanks in advance for your insight. All the best!
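    For what it's worth, one low-tech way to get exactly that roll-over behaviour is to skip the local DNS server entirely and override just the domains under development in the machine's hosts file; every name listed there resolves locally, and everything else falls through to the normal public resolver. A sketch, assuming a hypothetical dev server at 192.168.1.50:

        # /etc/hosts (or C:\Windows\System32\drivers\etc\hosts on Windows)
        # Only these names resolve to the development server; all other lookups,
        # such as Google Maps, still go through the configured public DNS.
        192.168.1.50    newclient.example.com www.newclient.example.com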

    Read the article

  • SQL SERVER – 5 Tips for Improving Your Data with expressor Studio

    - by pinaldave
    It's no secret that bad data leads to bad decisions and poor results. However, how do you prevent dirty data from taking up residency in your data store? Some might argue that it's the responsibility of the person sending you the data. While that may be true, in practice that will rarely hold up. It doesn't matter how many times you ask, you will get the data however they decide to provide it.

    So now you have bad data. What constitutes bad data? There are quite a few valid answers, for example:
    - Invalid date values
    - Inappropriate characters
    - Wrong data
    - Values that exceed a pre-set threshold

    While it is certainly possible to write your own scripts and custom SQL to identify and deal with these data anomalies, that effort often takes too long and becomes difficult to maintain. Instead, leveraging an ETL tool like expressor Studio makes the data cleansing process much easier and faster. Below are some tips for leveraging expressor to get your data into tip-top shape.

    Tip 1: Build reusable data objects with embedded cleansing rules

    One of the new features in expressor Studio 3.2 is the ability to define constraints at the metadata level. Using expressor's concept of Semantic Types, you can define reusable data objects that have embedded logic, such as constraints for dealing with dirty data. Once defined, they can be saved as a shared atomic type and then re-applied to other data attributes in other schemas. As you can see in the figure above, I've defined a constraint on zip code. I can then save the constraint rules I defined for zip code as a shared atomic type called zip_type, for example. The next time I get a different data source with a schema that also contains a zip code field, I can simply apply the shared atomic type and the previously defined constraints will be automatically applied.

    Tip 2: Unlock the power of regular expressions in Semantic Types

    Another powerful feature introduced in expressor Studio 3.2 is the option to use regular expressions as a constraint. A regular expression is used to identify patterns within data. The patterns could be something as simple as a date format or something much more complex, such as a street address. For example, I could define that a valid IP address should be made up of 4 numbers, each 0 to 255, separated by periods. So 192.168.23.123 might be a valid IP address whereas 888.777.0.123 would not be. How can I account for this using regular expressions?

    A very simple regular expression that would look for any four groups of 1 to 3 digits separated by periods would be: ^[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}$

    Alternatively, the following would be the exact check for truly valid IP addresses as we defined above: ^(25[0-5]|2[0-4][0-9]|1[0-9]{2}|[1-9]?[0-9])\.(25[0-5]|2[0-4][0-9]|1[0-9]{2}|[1-9]?[0-9])\.(25[0-5]|2[0-4][0-9]|1[0-9]{2}|[1-9]?[0-9])\.(25[0-5]|2[0-4][0-9]|1[0-9]{2}|[1-9]?[0-9])$

    In expressor, we would enter this regular expression as a constraint. Here we select the corrective action to be 'Escalate', meaning that the expressor Dataflow operator will decide what to do. Some of the options include rejecting the offending record, skipping it, or aborting the dataflow.

    Tip 3: Email pattern expressions that might come in handy

    In the example schema that I am using, there's a field for email. Email addresses are often entered incorrectly because people are trying to avoid spam. While there are a lot of different ways to define what constitutes a valid email address, a quick search online yields a couple of really useful regular expressions for validating them:

    This one is short and sweet: \b[A-Z0-9._%+-]+@[A-Z0-9.-]+\.[A-Z]{2,4}\b (Source: http://www.regular-expressions.info/)

    This one is more specific about which characters are allowed: ^([a-zA-Z0-9_\-\.]+)@((\[[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\.)|(([a-zA-Z0-9\-]+\.)+))([a-zA-Z]{2,4}|[0-9]{1,3})(\]?)$ (Source: http://regexlib.com/REDetails.aspx?regexp_id=26)

    Tip 4: Reject "dirty data" for analysis or further processing

    Yet another feature introduced in expressor Studio 3.2 is the ability to reject records based on constraint violations. To capture rejected records on input, simply specify Reject Record in the Error Handling setting for the Read File operator, then attach a Write File operator to the reject port of the Read File operator. Next, in the Write File operator, you can configure the expressor operator in a similar way to the Read File operator; the key difference is that the schema needs to be derived from the upstream operator. Once configured, expressor will output rejected records to the file you specified. In addition to the rejected records, expressor also captures some diagnostic information that will be helpful towards identifying why the record was rejected. This makes diagnosing errors much easier!

    Tip 5: Use a Filter or Transform after the initial cleansing to finish the job

    Sometimes you may want to predicate the data cleansing on a more complex set of conditions. For example, I may only be interested in processing data containing males over the age of 25 in certain zip codes. Using an expressor Filter operator, you can define the conditional logic which isolates the records of importance away from the others. Alternatively, the expressor Transform operator can be used to alter the input value via a user-defined algorithm or transformation. It also supports the use of conditional logic, and data can be rejected based on constraint violations.

    However, the best tip I can leave you with is to not constrain your solution design approach: expressor operators can be combined in many different ways to achieve the desired results. For example, I can post-process the reject data from the Filter which did not meet my pre-defined criteria and, if successful, Funnel it back into the flow so that it gets written to the target table.

    I continue to be impressed that expressor offers all this functionality as part of their FREE expressor Studio desktop ETL tool, which you can download from here. Their Studio ETL tool is absolutely free, and they are very open about saying that if you want to deploy their software on a dedicated Windows Server, you need to purchase their server software, whose pricing is posted on their website.

    Reference: Pinal Dave (http://blog.SQLAuthority.com)

    Filed under: Pinal Dave, PostADay, SQL, SQL Authority, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, T SQL, Technology
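    To make the difference between the loose and strict IP patterns concrete, here is a small standalone check; this is not expressor itself, just the article's two regular expressions run through .NET's Regex class:

        using System;
        using System.Text.RegularExpressions;

        class IpRegexDemo
        {
            // Four groups of 1-3 digits separated by periods (the loose pattern).
            const string Loose = @"^[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}$";

            // Each octet restricted to the 0-255 range (the strict pattern).
            const string Strict = @"^(25[0-5]|2[0-4][0-9]|1[0-9]{2}|[1-9]?[0-9])\." +
                                  @"(25[0-5]|2[0-4][0-9]|1[0-9]{2}|[1-9]?[0-9])\." +
                                  @"(25[0-5]|2[0-4][0-9]|1[0-9]{2}|[1-9]?[0-9])\." +
                                  @"(25[0-5]|2[0-4][0-9]|1[0-9]{2}|[1-9]?[0-9])$";

            static void Main()
            {
                Console.WriteLine(Regex.IsMatch("192.168.23.123", Loose));  // True
                Console.WriteLine(Regex.IsMatch("888.777.0.123", Loose));   // True - slips through
                Console.WriteLine(Regex.IsMatch("192.168.23.123", Strict)); // True
                Console.WriteLine(Regex.IsMatch("888.777.0.123", Strict));  // False
            }
        }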

    Read the article

  • New video in the course Introducción a C# con Visual Studio 2012

    - by carlone
    Dear friends, a new video in the course Introducción a C# con Visual Studio 2012 has been published: "Estructuras Cíclicas (Bucle For)" (Cyclic Structures: the For Loop). In this video we introduce the concept of loop structures and learn how to use the for loop. The code for the examples can be downloaded from https://latamcsharpvs2012.codeplex.com/ Regards, Carlos A. Lone

    Read the article

  • A Brief Discussion On Visual Studio 2010 Top Features

    In this article I will describe some of the new features of Visual Studio 2010 that I have explored so far. These features are really useful for productive development. This article is mainly targeted at beginners with Visual Studio 2010, but everybody can benefit from it.

    Read the article

  • Google Webmaster Tools: strange 404 errors referred from the same site

    - by Out of Control
    Starting about a month ago, I noticed a sudden increase in 404 errors in Webmaster Tools for one of my sites (over 1,400 errors so far). All the errors are referred from my own site to non-existent pages. The 404 error URLs are all of the same format: http://www.helloneighbour.com/save/1347208508000 - the number on the end appears to be a timestamp followed by 3 zeros. The referring page in this case is http://www.helloneighbour.com/save/cmw-insurance-insurance-burnaby. When I look at the source code of that page, or use Webmaster Tools to view the page as Google sees it, I can't find any link that comes close to the one above. I built the site, and I can't find anything that might be generating these false links either. The server logs (access and error) don't show Google or anyone else trying to access these links. I marked all these pages as fixed and waited a couple of weeks, only to find the errors coming back again over the last few days. I'm wondering if anyone else has seen anything strange like this, or if someone might have a way for me to debug and replicate this error myself.
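    As a quick sanity check of the "timestamp followed by 3 zeros" theory, the trailing number does decode cleanly as Unix epoch milliseconds; a sketch in C# (DateTimeOffset.FromUnixTimeMilliseconds requires .NET Framework 4.6 or later):

        using System;

        class TimestampCheck
        {
            static void Main()
            {
                // 1347208508000 is the trailing number from the 404 URL quoted above.
                var when = DateTimeOffset.FromUnixTimeMilliseconds(1347208508000);
                Console.WriteLine(when); // prints a date in September 2012
            }
        }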

    Read the article

  • Code refactoring with Visual Studio 2010 Part-1

    - by Jalpesh P. Vadgama
    Visual Studio 2010 is a great IDE (Integrated Development Environment) and we all use it day by day for our coding. There are many great features in Visual Studio 2010, and today I am going to show one of them: code refactoring. This is one of the most unappreciated features of Visual Studio 2010, as lots of people still don't use it and do this stuff manually. To explain the feature, let's create a simple console application which prints a first name and last name, using the following code.

        using System;

        namespace CodeRefractoring
        {
            class Program
            {
                static void Main(string[] args)
                {
                    string firstName = "Jalpesh";
                    string lastName = "Vadgama";
                    Console.WriteLine(string.Format("FirstName:{0}", firstName));
                    Console.WriteLine(string.Format("LastName:{0}", lastName));
                    Console.ReadLine();
                }
            }
        }

    So as you can see, this is a very basic console application; let's run it to see the output. Now let's explore our first feature, called Extract Method. You can reach it via the Refactor menu: select the code for which you want to extract a method, then click the Refactor menu and then Extract Method. Here I am selecting three lines of code and clicking Refactor -> Extract Method. Once you click the menu item, a dialog box appears. I have highlighted two things: first is Method Name, where I put Print as the method name, and the other is Preview Method Signature, where it is smart enough to extract the parameters too, as we selected three lines with Console.WriteLine. Once you click OK, it extracts the method and your code looks like this.

        using System;

        namespace CodeRefractoring
        {
            class Program
            {
                static void Main(string[] args)
                {
                    string firstName = "Jalpesh";
                    string lastName = "Vadgama";
                    Print(firstName, lastName);
                }

                private static void Print(string firstName, string lastName)
                {
                    Console.WriteLine(string.Format("FirstName:{0}", firstName));
                    Console.WriteLine(string.Format("LastName:{0}", lastName));
                    Console.ReadLine();
                }
            }
        }

    So as you can see in the code above, it has created a static method called Print and also passed the first name and last name as parameters. Isn't that great? It created Print as a static method because I am calling it from the static void Main. Hope you liked it. Stay tuned for more. Till then, happy programming.

    Read the article

  • License name at startup in Visual Studio 2010

    - by anirudha
    Whenever we install Visual Studio on our system, we find that the Express edition and Visual Studio never show our name at startup; by default they show Microsoft. Here is a way to change that to your name or organization name if you want. Go to HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\Windows NT\CurrentVersion and change the value of RegisteredOrganization to change the organization name shown below the user name, or change RegisteredOwner to change the name of the user.
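    For example, the current values can be inspected and changed from an elevated command prompt with reg.exe; a sketch, where "Contoso" and "John" are placeholder values:

        reg query "HKLM\SOFTWARE\Wow6432Node\Microsoft\Windows NT\CurrentVersion" /v RegisteredOrganization
        reg add "HKLM\SOFTWARE\Wow6432Node\Microsoft\Windows NT\CurrentVersion" /v RegisteredOrganization /t REG_SZ /d "Contoso" /f
        reg add "HKLM\SOFTWARE\Wow6432Node\Microsoft\Windows NT\CurrentVersion" /v RegisteredOwner /t REG_SZ /d "John" /f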

    Read the article

  • VS 2010 ALM Whitepapers - link from Neno Loje

    - by johndoucette
    Overview of Visual Studio ALM Whitepapers by Microsoft

    Overview:
    - Visual Studio 2010 Quick Reference Guidance

    Installation, Configuration & Administration:
    - Team Foundation Installation Guide for Visual Studio Team System 2010
    - Administration Guide for Microsoft Visual Studio 2010 Team Foundation Server
    - Visual Studio 2010 TFS Upgrade Guide
    - Visual Studio 2010 and Team Foundation Server 2010 VM Factory
    - TFS Integration Platform
    - Visual Studio 2010 Licensing White Paper

    Requirements:
    - Visual Studio 2010 Team Foundation Server Requirements
    - Requirements Management Guidance

    Version Control & Configuration Management:
    - Visual Studio TFS Branching Guide 2010

    Some guide or whitepaper missing? Let me know!

    Read the article

  • How to Create Custom SharePoint Workflows in Visual Studio 2008

    Whereas simple workflows are possible using Microsoft Office SharePoint Designer, you will soon reach the point where you will need to use Visual Studio. In the third article in Charles' introduction to workflows in SharePoint, he demonstrates how to create a workflow from scratch using Visual Studio, and discusses the relative merits of the two tools for this sort of development work.

    Read the article

  • MS Bing web crawler out of control causing our site to go down

    - by akaDanPaul
    Here is a weird one that I am not sure what to do about. Today our company's e-commerce site went down. I tailed the production log and saw that we were receiving a ton of requests from the IP range 157.55.98.0-157.55.100.0. I googled around and found out that it is an MSN web crawler. So essentially the MS web crawler overloaded our site, causing it not to respond, even though our robots.txt file contains the following:

        Crawl-delay: 10

    So what I did was ban the IP range in iptables. But what I am not sure about is how to follow up. I can't find anywhere to contact Bing about this issue, I don't want to keep those IPs blocked because I am sure we will eventually get de-indexed from Bing, and it doesn't really seem like this has happened to anyone else before. Any suggestions?

    Update: my server / web stats. Our web server uses Nginx, Rails 3, and 5 Unicorn workers. We have 4 GB of memory and 2 virtual cores. We have been running this setup for over 9 months now and never had an issue; 95% of the time our system is under very little load. On average we receive 800,000 page views a month, and this never comes close to bringing down or slowing down our web server. Looking at the logs, we were receiving anywhere from 5 up to 40 requests/second from this IP range. In all my years of web development I have never seen a crawler hit a website so many times. Is this new with Bing?
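    For reference, the ban described above could look something like the following sketch; iptables' iprange match covers the exact span quoted from the logs:

        # Drop all traffic from the crawler's address range (run as root)
        iptables -A INPUT -m iprange --src-range 157.55.98.0-157.55.100.0 -j DROP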

    Read the article

  • MSSQL: Copying data from one database to another

    - by DigiMortal
    I have a database with data imported from another server using the import and export wizard of SQL Server Management Studio. There is also an empty database with the same tables, but that one also has primary keys, foreign keys and indexes. How do you get the data from the first database into the other? Here is the description of my crusade. And believe me, it is not a nice one.

    Bugs in the import and export wizard

    There are some awful bugs in the import and export wizard that make data imports and exports possible only in a very limited manner: the wizard is not able to analyze foreign keys, and the wizard always wants to create tables, whatever you say in the settings. The result is a faulty and useless package. Now let's go step by step and make things work in our scenario.

    Database

    There are two databases. Let's name them like this: PLAIN contains the data imported from the remote server (no indexes, no keys, no nothing, just plain dumb data); CORRECT is an empty database with the same structure as the remote database (indexes, keys and everything else, but no data). Our goal is to get the data from PLAIN to CORRECT.

    1. Create the import and export package

    At this point we will create the faulty SSIS package using SQL Server Management Studio. Run the import and export wizard and let it create an SSIS package that reads data from CORRECT and writes it to, let's say, CORRECT-2. Make sure you enable identity insert. Make sure there are no views selected. Make sure you don't let the package create tables (you can miss this step because it wants to create tables anyway). Save the package to SSIS.

    2. Modify the import and export package

    Now let's clean up the package and remove all the faulty crap. Connect SQL Server Management Studio to the SSIS instance. Select the package you just saved and export it to your hard disc. Run Business Intelligence Studio. Create a new SSIS project (DON'T MISS THIS STEP). Add the package from disc as an existing item to the project and open it. Move to the Control Flow page and do one of the following: remove all preparation SQL tasks and connect the Data Flow tasks, or modify all preparation SQL tasks so the existence of tables is checked before a table is created (yes, you have to do it manually).

    Add a new Execute SQL task as the first task in the control flow: open the task properties, assign the destination connection as the connection to use, and insert the following SQL as the command:

        EXEC sp_MSForEachTable 'ALTER TABLE ? NOCHECK CONSTRAINT ALL'
        GO

        EXEC sp_MSForEachTable 'DELETE FROM ?'
        GO

    Save the task. Add a new Execute SQL task as the last task in the control flow: open the task properties, assign the destination connection as the connection to use, and insert the following SQL as the command:

        EXEC sp_MSForEachTable 'ALTER TABLE ? CHECK CONSTRAINT ALL'
        GO

    Save the task. Now connect the first Execute SQL task with the first Data Flow task, and the last Data Flow task with the second Execute SQL task. Then move to the Package Explorer tab and change the connections under the Connection Managers folder: make the source connection use database PLAIN and the destination connection use database CORRECT. Save the package and rebuild the project. Update the package using SQL Server Management Studio. Some hints: make sure you take the package from the solution folder, because it is saved there now; don't overwrite the existing package - use a numeric suffix and let Management Studio create a new version of the package. Now you are done with your package. Run it to test it and clean out all the errors you find.

    TRUNCATE vs DELETE

    You can see that I used DELETE FROM instead of TRUNCATE. Why? Because TRUNCATE has some nasty limits (taken from MSDN): "You cannot use TRUNCATE TABLE on a table referenced by a FOREIGN KEY constraint; instead, use DELETE statement without a WHERE clause. Because TRUNCATE TABLE is not logged, it cannot activate a trigger. TRUNCATE TABLE may not be used on tables participating in an indexed view." As I am not sure what tables you have and how they are used, I provided here the solution that should work for all scenarios. If you need better performance, then in some cases you can use TRUNCATE TABLE instead of DELETE.

    Conclusion

    My conclusion is bitter this time, although I am a very positive guy. It is A.D. 2010 and still we have to write stupid hacks for simple things. The simple tools that existed before are long gone, and we have to live with mysterious bloatware that is our only choice when using the default tools. If you look at the length of this posting and the number of steps I had to take for one easy thing, you should treat it as a signal that something has gone wrong in recent years. Although I got my job done, I would still be happier if the out-of-the-box tools become more intelligent one day.

    References:
    - T-SQL Trick for Deleting All Data in Your Database (Mauro Cardarelli)
    - TRUNCATE TABLE (MSDN Library)
    - Error Handling in SQL 2000 - a Background (Erland Sommarskog)
    - Disable/Enable Foreign Key and Check constraints in SQL Server (Decipher)

    Read the article

  • Site in subdomain (MaraDNS + Nginx)

    - by Grzegorz
    Hello, I'm currently doing some experiments on my VPS with Ubuntu. I've installed MaraDNS with Nginx. At this moment I have correctly launched a static site which is available from the Internet (maindomain.com). As the next step I want to add a new site which will be available on a subdomain, for example dev.maindomain.com. I've tried this in the db.maindomain.com file (used by MaraDNS):

        maindomain.com.     xxx.xxx.xxx.xxx
        www.maindomain.com. CNAME maindomain.com.
        dev.maindomain.com. xxx.xxx.xxx.xxx

    where xxx.xxx.xxx.xxx is the VPS IP address. In nginx.conf I have:

        server {
            listen 80;
            server_name maindomain.com;
            access_log /var/log/nginx/maindomain.com.log;
            location / {
                root /var/www/maindomain.com;
                index index.html;
            }
        }

        server {
            listen 80;
            server_name dev.maindomain.com;
            access_log /var/log/nginx/dev.maindomain.com.log;
            location / {
                root /var/www/dev.maindomain.com;
                index index.html;
            }
        }

    With this configuration maindomain.com works properly, but dev.maindomain.com isn't available. When I try ping dev.maindomain.com I get my xxx.xxx.xxx.xxx IP. Do you have any suggestions how I can resolve this problem?
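    Two quick checks that might help narrow down whether the problem is on the MaraDNS side or the nginx side, as a sketch (keep xxx.xxx.xxx.xxx as your real VPS address):

        # Ask MaraDNS directly whether it serves the record:
        dig @xxx.xxx.xxx.xxx dev.maindomain.com A

        # Ask nginx directly, bypassing DNS, whether the vhost answers:
        curl -H "Host: dev.maindomain.com" http://xxx.xxx.xxx.xxx/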

    Read the article
