Search Results

Search found 8279 results on 332 pages for 'django permissions'.


  • WSS 3.0 to SharePoint 2010: Tips for delaying the Visual Upgrade

    - by Kelly Jones
    My most recent project has been to migrate a bunch of sites from WSS 3.0 (SharePoint 2007) to SharePoint Server 2010.  The users are currently working with WSS 3.0 and Office 2003, so the new ribbon-based UI in 2010 will be completely new.  My client wants to avoid the new SharePoint 2010 look and feel until they’ve had time to train their users, so we’ve been testing the upgrades by keeping them on the 2007 user interface.

    Permission to perform the Visual Upgrade

    One of the first things we noticed was the default permissions for who was allowed to switch the UI from 2007 to 2010.  By default, site collection administrators and site owners can do this.  Since we wanted to more tightly control the timing of the new UI, I added a few lines to the PowerShell script that we are using to perform the migration.  This script creates the web application, sets the User Policy, and then does a Mount-SPDatabase to attach the old 2007 content database to the 2010 farm.  I added the following steps after the Mount-SPDatabase step:

        # Remove the visual upgrade option for site owners;
        # it remains available for Site Collection administrators
        foreach ($sc in $WebApp.Sites) {
            foreach ($web in $sc.AllWebs) {
                # Visual Upgrade permissions for the site/subsite (web)
                $web.UIVersionConfigurationEnabled = $false;
                $web.Update();
            }
        }

    These script steps loop through each Site Collection ($sc) in a particular web application ($WebApp), then through each subsite ($web) in the Site Collection, and disable the Site Owner’s permission to perform the Visual Upgrade. This is equivalent to going to the Site Collection administrator settings page -> Visual Upgrade and selecting “Hide Visual Upgrade”. Since only IT people have Site Collection administrator privileges, this will allow IT to control the timing of the new 2010 UI rollout.

    Newly created subsites

    Our next issue was brought to our attention by SharePoint Joel’s blog post last week (http://www.sharepointjoel.com/Lists/Posts/Post.aspx?ID=524).  In it, he lists some updates about the 2010 upgrade, and his fourth point was one that I hadn’t seen yet:

    “4. If a 2007 upgraded site has not been visually upgraded, the sites created underneath it will look like 2010 sites – While this is something I’ve been aware of, I think many don’t realize how this impacts common look and feel for master pages, and how it impacts good navigation and UI. As well depending on your patch level you may see hanging behavior in the list picker. The site and list creation Silverlight control in Internet Explorer is looking for resources that don’t exist in the galleries in the 2007 site, and hence it continues to spin and spin and eventually time out. The work around is to upgrade to SP1, or use Chrome or Firefox which won’t attempt to render the Silverlight control. When the root site collection is a 2007 site and has its set of galleries and the children are 2010 sites there is some strange behavior linked to the way that the galleries work and pull from the parent.”

    Our production SharePoint 2010 farm has SP1 installed, as well as the December 2011 Cumulative Update, so I think the “hanging behavior” he mentions won’t affect us. However, since we want to control the rollout of the UI, we are concerned that new subsites will have the 2010 look and feel, no matter what the parent site has. Ok, time to dust off my developer skills. I first looked into using feature stapling, but I couldn’t get that to work (although I’m pretty sure I had everything wired up correctly).
    Then I stumbled upon SharePoint 2010’s web events – a great way to handle this. Using Visual Studio 2010, I created a new SharePoint project and added a Web Event Receiver. In the Event Receiver class, I used the WebProvisioned method to check if the parent site is a 2007 site (UIVersion = 3), and if so, then set the newly created site to 2007:

        /// <summary>
        /// A site was provisioned.
        /// </summary>
        public override void WebProvisioned(SPWebEventProperties properties)
        {
            base.WebProvisioned(properties);

            try
            {
                SPWeb curweb = properties.Web;

                if (curweb.ParentWeb != null)
                {
                    // check if the parent website has the 2007 look and feel
                    if (curweb.ParentWeb.UIVersion == 3)
                    {
                        // since parent site has 2007 look and feel
                        // we'll apply that look and feel to the current web
                        curweb.UIVersion = 3;
                        curweb.Update();
                    }
                }
            }
            catch (Exception)
            {
                // TODO: Add logging for errors
            }
        }

    This event is part of a Feature that is scoped to the Site level (Site Collection).  I added a couple of lines to my migration PowerShell script to activate the Feature for any site collections that we migrate.

    Plan Going Forward

    The plan going forward is to perform the visual upgrade after the users for a particular site collection have gone through 2010 training. If we need to do several site collections at once, we’ll use a PowerShell script to loop through each site collection to update the sites to 2010.  If it’s just one or two, we’ll be using the “Update All Sites” button on the Visual Upgrade page for Site Collection Administrators. The custom code for newly created sites won’t need to be changed, since it relies on the UI version of the parent site.  If the parent is 2010, then the new site will look 2010.

    Read the article

  • VMware sharing folders between Win7 (host) and Ubuntu (guest)

    - by Pawel Goscicki
    What is the best way to share a folder between Win7 64bit (host) and Ubuntu 10.10 (guest) in VMware player? I can setup the sharing just fine (using vmware-tools), but all shared files are root:root owned with 777 permission set. Which, well, sucks. What can I do to have shared files with preserved ownership and permissions? I'm guessing I would need some kind of a file container, that would get mounted in Ubuntu as a block device (if so, it would need to be dynamic, i.e. expand with size of contained files). But maybe there is a better solution?
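    For reference, a minimal sketch of how ownership is usually coerced on an HGFS share at mount time; the mount point, the uid/gid values and the uid/gid/fmask/dmask options are assumptions that depend on the vmware-tools version in the guest:

        # find your numeric uid and gid in the guest
        id -u; id -g

        # remount the share mapping everything to your user (values are examples)
        sudo umount /mnt/hgfs
        sudo mount -t vmhgfs .host:/ /mnt/hgfs -o uid=1000,gid=1000,fmask=0133,dmask=0022

        # or persist it in /etc/fstab:
        # .host:/  /mnt/hgfs  vmhgfs  defaults,uid=1000,gid=1000,fmask=0133,dmask=0022  0  0

    Note that this only fakes a single owner and permission mask on the guest side; HGFS is not a POSIX filesystem, so truly preserving per-file ownership would indeed need something like a loopback image or a network filesystem instead.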

    Read the article

  • Running an old version of some software

    - by Mark Oak
    I don't want to mingle in any backstory, but all that needs to be known is that I have a computer with Ubuntu on it and I am trying to install Windows 8 from an ISO. I am using the guide that can be found here, which is a little more than four years old. Now, I've been able to accomplish everything up to Step 2, at which point I am stuck. I have downloaded the file found on that page, which can be found here, and have attempted to use it as directed, quote: "right click the downloaded Unetbootin file, select Properties and on the "Permissions" tab, check the "Allow executing file as program" box. Then simply double click it and it should open." But, after having checked the specified box and double-clicking the file, nothing happens. Nothing is launched and nothing changes. I've been stuck here for several hours now, having failed to find a solution via Google.
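    A quick sketch of the terminal equivalent, which also surfaces any error message that a silent double-click swallows (the filename is an example and should match the actual download):

        cd ~/Downloads
        chmod +x unetbootin-linux-*     # same as ticking "Allow executing file as program"
        sudo ./unetbootin-linux-*       # run it from a terminal so errors are visible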

    Read the article

  • Ethernet on an Acer Aspire One D255E not working

    - by Dustin
    Ok, very frustrated here, but hopefully I can find an answer. I have an Acer Aspire One netbook, model D255E. I installed 10.04 desktop edition via live USB, only to find out that ethernet is not connecting. The thing is, I had previously installed 10.04 through Wubi with 30GB, but wanted more space, so I did this full install. With the Wubi install it worked great: the internet worked, and I added LibO, the Mint menu, etc. So I know it can work, and it works in Win 7 (with what little space is left). I have ethernet enabled, and gave myself all permissions, but it is still not working. Thank you for any help, and the more detail in the steps I need to take, the better :).
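    For anyone in the same spot, a hedged diagnostic sketch to establish whether the kernel even sees the wired NIC and whether a driver claimed it (nothing here is specific to the D255E):

        lspci -nn | grep -i ethernet    # identify the chipset
        sudo lshw -C network            # "UNCLAIMED" here means no driver is bound
        ifconfig -a                     # does an eth0 interface exist at all?
        dmesg | grep -i eth             # kernel messages about the interface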

    Read the article

  • CSV files not being written

    - by Kamalpreet
    I worked on a small project in which the data entered in HTML forms was saved in a CSV file, which was used subsequently. The files were run on Apache2. It worked fine. After about 25 days, when I reopened the project, the data entered in the forms was no longer being saved to the CSV file. I checked all the permissions. I even sent a zip file of my files to one of my friends, and it worked well on his system, so I figure there's some problem with my system. I am using Ubuntu 13.04. Kindly suggest something so that I am able to figure out the problem.
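    A minimal sketch of the usual checks when a working app suddenly stops writing files under Apache; the data path is an example:

        # who owns the directory the CSV is written into, and can www-data write there?
        ls -ld /var/www/myproject/data

        # anything useful in Apache's error log?
        sudo tail -n 50 /var/log/apache2/error.log

        # give the Apache user (www-data on Ubuntu) group write access to that directory
        sudo chgrp www-data /var/www/myproject/data
        sudo chmod g+w /var/www/myproject/data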

    Read the article

  • sudo & redirect output

    - by Khaled
    I have a small question regarding using sudo with the output redirect >. To enable IP forwarding, one can use the command:

        $ echo 1 > /proc/sys/net/ipv4/ip_forward

    Executing this command gives permission denied, as it requires root privileges. However, executing the same command with sudo also gives a permission denied error! It seems that the output redirect > does not inherit the permissions of the preceding command echo. Is this right? As a workaround I do:

        $ echo 1 | sudo tee /proc/sys/net/ipv4/ip_forward

    Is this the best way to do it? Am I missing something? Please note that this is just an example; it applies to all commands that use output redirection.
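    Two other standard ways around the problem, for comparison (the tee approach above is fine too):

        # run the redirection itself inside a root shell
        sudo sh -c 'echo 1 > /proc/sys/net/ipv4/ip_forward'

        # for sysctl-backed files like this one, skip the redirect entirely
        sudo sysctl -w net.ipv4.ip_forward=1

        # and to silence tee's copy to stdout in the original workaround
        echo 1 | sudo tee /proc/sys/net/ipv4/ip_forward > /dev/null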

    Read the article

  • How can I cleanup a botched Truecrypt installation?

    - by Don F
    I made the mistake of downloading the x64 version of TrueCrypt and trying to install it when I'm actually running the 32-bit version of Precise Pangolin. I want to clean up the files that I am unable to use, but of course I can't just run the uninstall since TrueCrypt could not be installed in the first place. I am new to this but I have spent some time researching the command line. When I run "locate truecrypt -i" in the terminal, I see several relevant files in the /usr/bin and /usr/share directories. No "rm" commands work on these listed files -- I only get "no such file or directory" back. I'm sure this has something to do with permissions, but I don't know what I'm missing here. Why is it that I cannot find these files through the GUI (even when I select "show hidden files") or when I try to navigate to them via the terminal using cd and ls commands? How can I remove these files (they are there, aren't they?), one way or another, from my system? Your patience and time are appreciated.
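    Worth knowing: locate answers from a database built periodically by updatedb, so it happily lists files that have since been deleted. A short sketch to check what is actually on disk:

        sudo updatedb                        # refresh the locate database
        locate -i truecrypt                  # does it still report anything?

        ls -l /usr/bin | grep -i truecrypt   # look at the directories directly
        ls -l /usr/share | grep -i truecrypt

        # if real files remain, remove them explicitly (paths are examples)
        sudo rm -f /usr/bin/truecrypt
        sudo rm -rf /usr/share/truecrypt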

    Read the article

  • .pam_environment in kerberized nfs4 home directory

    - by Paul Stoever
    How can I get pam_env to read the user's .pam_environment file if the user's file is located on a kerberized NFS4 mount? The file and directory permissions for the .pam_environment file are set in a way that allows the local root to read the file. Reading .pam_environment only fails on the first login; subsequent logins successfully read the file. The client runs Ubuntu 12.04 Desktop; the NFS/Kerberos server is 12.04 Server. The Kerberos/NFS4 stuff works with the exception of this. From /var/log/auth for the first login:

        ...
        lightdm: pam_krb5(lightdm:auth): user USERNAME authenticated as USERNAME@REALM
        lightdm: pam_unix(lightdm:session): session closed for user lightdm
        lightdm: pam_env(lightdm:setcred): Unable to open config file: USERHOME/.pam_environment: Permission denied
        lightdm: pam_env(lightdm:setcred): Unable to open config file: USERHOME/.pam_environment: Permission denied
        lightdm: pam_unix(lightdm:session): session opened for user USERNAME by (uid=0)
        ...
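    One hedged workaround that is sometimes suggested: at setcred time the Kerberos ticket (and therefore the NFS4 home) does not exist yet, so let the user file be read in the session phase instead, after pam_krb5 has obtained credentials. A sketch of the relevant lines, to be adapted to the existing stack:

        # /etc/pam.d/common-session (sketch; order matters, after the krb5/unix session lines)
        session required  pam_env.so readenv=1 envfile=/etc/default/locale
        session required  pam_env.so user_readenv=1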

    Read the article

  • How do I set PATH variables for all users on a server?

    - by Rob S.
    I just finished installing LaTeX for my company's Ubuntu server that we all SSH into to use. At the end of the install it says this:

        Add /usr/local/texlive/2010/texmf/doc/man to MANPATH, if not dynamically determined.
        Add /usr/local/texlive/2010/texmf/doc/info to INFOPATH.
        Most importantly, add /usr/local/texlive/2010/bin/x86_64-linux to your PATH for current and future sessions.

    So, my question is simply: How do I do this so that these variables are set for all users on the system? (And yes, I have sudo permissions). Thanks in advance to any and all responses I receive.
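    One common system-wide approach on Ubuntu is a small script in /etc/profile.d, which login shells source for every user. A sketch using the paths from the installer output:

        # create /etc/profile.d/texlive.sh (e.g. with: sudoedit /etc/profile.d/texlive.sh)
        # containing these three lines:
        export PATH="/usr/local/texlive/2010/bin/x86_64-linux:$PATH"
        export MANPATH="/usr/local/texlive/2010/texmf/doc/man:$MANPATH"
        export INFOPATH="/usr/local/texlive/2010/texmf/doc/info:$INFOPATH"

        # takes effect at the next login; to pick it up in the current shell:
        source /etc/profile.d/texlive.sh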

    Read the article

  • BlissControl Is a Settings Management Dashboard for Popular Social Networks

    - by Jason Fitzpatrick
    BlissControl is a simple web app that organizes the different settings menus of over a dozen social networks and services into a streamlined dashboard to help you change your profile pic, privacy settings, and more. Much like previously reviewed NotificationControl and MyPermissions (which help you check and set email notifications and app permissions, respectively), BlissControl also takes the very convoluted menus of web apps and social media sites and makes them super easy to navigate. You can easily click right through to the page you need on Facebook, Flickr, Twitter, and more – you’ll no longer need to visit each service and click through a maze of menus to get to the right place to change your password or swap your profile pic. BlissControl is simply a dashboard that directs you to the appropriate page within the service you already use – you never share your login credentials with BlissControl. Hit up the link below to take it for a spin. BlissControl [via AddictiveTips]

    Read the article

  • 12.04 Ambiance dark side bar issue when clicking folder on desktop

    - by Lou Crittenden
    Concerning an updated custom ambiance theme: why is there no dark side bar when I click on a folder, like the home folder icon on the desktop, but the theme works as planned when I type nautilus in the terminal to open the home folder or when opening a folder up as root? Permissions issue perhaps? Note: I am using Cinnamon instead of Unity and I noticed it uses the Nemo file manager instead of Nautilus and I suspected that it was causing me the grief. I uninstalled it and now am using Nautilus only as the file manager. I found this out when I typed: sudo killall nemo and the problem went away. I'll see how this goes (and hopefully cinnamon doesn't care about it...) Has anyone else had any issues with this?

    Read the article

  • Some files not copied when moving an encrypted home to a different partition

    - by Jon Herrin
    I have "successfully" moved my encrypted home to a separate partition using the instructions here: How can i move an encrypted home directory to another partition? However, some files are not being copied over. Most notably, I have a directory in my old home that contains the themes I use. This directory and it's contents are not copied over to the new home and therefore I come up with the default theme. Permissions on the directory that was not moved are identical to the other directories in home. Another discrepancy is that my Dropbox folder came over empty and had to resync itself. My concern is what else might be missing from the copied home. At this point, I've flipped back to the old home by re-editing /etc/fstab, but I'd really like to get /home cleanly and completely off of root without having to core the system.

    Read the article

  • .mdf Database Filetype

    - by James Izzard
    Would somebody be kind enough to correct my understanding of the following (if incorrect)? Microsoft's .mdf file-type can be used by both the LocalDB and the full Server database engines (apologies if engine is not the correct word?). The .mdf file does not care which of these two options are accessing it - so you could use either to access any given .mdf file, provided you had permissions and password etc. The LocalDB and the SQL Server are two options that can be interchangeably chosen to access .mdf files depending on the application requirements. Appreciate any clarification. Thanks

    Read the article

  • Security Issues When Creating Pages in SharePoint

    - by Damon
    I was speaking (or rather IM'ing) with Ben Collins a while back and he came across an interesting problem that I wanted to document for the sake of posterity.  If you have a SharePoint user who has permissions to create a page in a page library, but that user is having security issues trying to actually make a page, then the security issue may be related to their access rights on the master page gallery.  Users who create pages must have at least restricted read access to the master page gallery for page creation to succeed. That is one of the joys of working in SharePoint: if something doesn't show up, there is usually a good but obscure reason for it, but SharePoint certainly won't tell you outright why.  All I have to say is that I'm glad he ran into that issue and not me.

    Read the article

  • file copy error from system to cifs mount

    - by dwpriest
    When copying a file greater than 64kB from an Ubuntu server to a CIFS-mounted Windows share, most of the data is copied, but it seems the last chunk doesn't get copied. The size doesn't match, and the md5 checksums don't match. I have plenty of file space, but when I use cp, I get the following:

        cp: closing `cloudBackup/asdf.txt': No space left on device

    Using rsync, I get the following:

        rsync: close failed on "/home/fluffy/cloudBackup/.asdf.txt.qrBWe6": No space left on device (28)
        rsync error: error in file IO (code 11) at receiver.c(752) [receiver=3.0.8]
        rsync: connection unexpectedly closed (29 bytes received so far) [sender]
        rsync error: error in rsync protocol data stream (code 12) at io.c(601) [sender=3.0.8]

    I have full read/write permissions on the mounted share. I can copy locally and between the mount and the system via SSH just fine. Any ideas? Thank you
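    A hedged diagnostic sketch for this kind of "no space on close" error: first see what the mount itself reports (a quota on the Windows side shows up here), then reproduce with a plain write of known size, and if the server is old, try an explicit wsize when mounting. Mount point, share name and credentials file are examples:

        df -h /home/fluffy/cloudBackup
        dd if=/dev/zero of=/home/fluffy/cloudBackup/testfile bs=1M count=5

        sudo umount /home/fluffy/cloudBackup
        sudo mount -t cifs //server/share /home/fluffy/cloudBackup \
             -o credentials=/root/.smbcredentials,wsize=65536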

    Read the article

  • Have lampp use PHP code in a directory not under /opt/lampp

    - by Sundeep
    I have my lampp installed in the default /opt/lampp directory and the PHP code is in the htdocs folder. Now, to edit any of the files I have to use sudo and type my password (or use sudo -i), which I do not want to do. All I want to know is whether I can somehow make lampp use code residing in a folder not located under /opt/lampp/. I tried giving the full path and using a '..' relative path - neither seemed to work. Or is it okay to do my work in the /opt/lampp/ folder by using sudo all the time?
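    Two common ways around the sudo requirement, sketched with example paths (XAMPP's Apache normally follows symlinks out of htdocs, but that is worth double-checking in its config):

        # option 1: keep the code under $HOME and symlink it into htdocs (sudo needed once)
        sudo ln -s /home/sundeep/projects/mysite /opt/lampp/htdocs/mysite

        # option 2: give your own user group-write access to htdocs instead of using sudo
        sudo chgrp -R sundeep /opt/lampp/htdocs
        sudo chmod -R g+w /opt/lampp/htdocs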

    Read the article

  • What is wrong with Unix/linux

    - by John Smith
    This is a genuine question motivated by Ideal Operating System. When I moved from DOS to Linux in the late 90s it was an eye opener for me. Long file names, arbitrarily many extensions, etc... Now I look at Linux and Unix and see all sorts of issues. Here are things I see which could be fixed:
    Too much depends on root, and rootly powers cannot be voluntarily delegated across several users. (I would love to give up my power to manage printers and delegate the job to another account.)
    File permissions are very limited, and there is not much metadata to go with files.
    The "everything is a file" metaphor is not true; Plan 9 gets it right(er).
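    (For what it's worth, the printer example can be delegated on a stock Ubuntu box without handing over full root; a sketch, with the account name as a placeholder:)

        # let another account administer printers via CUPS
        sudo usermod -aG lpadmin otheruser

        # or grant exactly one administrative command through sudo (edit with visudo)
        # otheruser ALL=(root) /usr/sbin/lpadmin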

    Read the article

  • Wordpress asks for FTP credentials, even though owner of the whole /var/www is www-data

    - by dmt0
    I'm trying to get WordPress to run on my local Ubuntu 12.10 installation. When I try to install themes/plugins, it asks me for FTP credentials. I've been trying to get this to work for 2 days now. Everywhere on the web it says you should change the owner:group of your WordPress directory to whatever owner:group runs your Apache server - in my case www-data:www-data. I've done this, and even tried setting permissions on the whole /var/www directory to 777, restarting Apache and the whole system. But WordPress is still asking me for the FTP credentials. What else could this be?
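    WordPress falls back to asking for FTP details when the PHP process cannot write to the WordPress tree as itself, so the usual next step is to confirm which user Apache really runs as and who actually owns the files (paths are examples):

        ps aux | grep -E '[a]pache2|[h]ttpd'      # which user are the worker processes?
        ls -ld /var/www /var/www/wp-content       # who owns the tree right now?
        sudo chown -R www-data:www-data /var/www  # make it match

        # last resort: force direct filesystem access by adding to wp-config.php:
        #   define('FS_METHOD', 'direct');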

    Read the article

  • Dual Boot: access Mac OS HD folders from Ubuntu

    - by dresde
    I did it!!! I'm right now writing from Ubuntu 11.10 installed on my MacBook Pro using dual boot!!! The only thing is, how can I now access my Mac folders? From Ubuntu, if I try to open Music, Documents or any of those folders belonging to the Mac user, I get the following: [The folder contents could not be displayed. You do not have the permissions necessary to view the contents of "Music"] I can access them if I run Nautilus as root (gksudo nautilus), but I would like to just be able to browse those folders. Thanks!
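    A hedged sketch of the usual fix: the OS X account's numeric uid (often 501) doesn't match the Ubuntu one (usually 1000), and the HFS+ driver honours the on-disk owner, so remount the Mac partition mapping ownership to your user. The device name and mount point are examples, and the uid/gid options are assumptions about the hfsplus driver:

        sudo blkid | grep -i hfs     # find the OS X partition, e.g. /dev/sda2
        id -u                        # your Ubuntu uid, typically 1000

        sudo mkdir -p /media/osx
        sudo mount -t hfsplus -o ro,uid=1000,gid=1000 /dev/sda2 /media/osx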

    Read the article

  • Dual Booting Windows 8 and Ubuntu 12.10 - thinkpad x230

    - by user110703
    I am having problems getting grub to load Windows 8 properly after installing Ubuntu 12.10 and Windows 8 on a solid state drive. Here's what I did:
    1. Fresh install of Windows 8 using a USB recovery drive (partitioned the SSD for UEFI)
       -- Tested the Windows install and it worked fine
    2. Built a bootable USB with Ubuntu 12.10 64-bit and installed Ubuntu
       -- Used Ubuntu's installer to partition the Windows 8 partition and install there
    3. Reboot - try to load Windows 8 from grub
       -- Ubuntu loads correctly; the Windows load reports various problems with permissions and not being able to find files - I'll update what the actual errors are
    4. Tried to fix the boot problem using boot-repair
       -- here's the output: http://paste.ubuntu.com/1384522/
    So, this is my first time trying to set up a dual boot system and I think that UEFI is the main culprit in getting this to work correctly. What do I need to
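    A hedged first round of checks for this kind of UEFI dual-boot problem is to confirm that both systems were installed in the same firmware mode and to look at what the firmware and the EFI system partition actually contain:

        sudo efibootmgr -v    # firmware boot entries (only works when booted in UEFI mode)
        ls /boot/efi/EFI      # should show both an ubuntu and a Microsoft directory
        sudo update-grub      # regenerate grub's menu after any change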

    Read the article

  • rsync not working between NTFS/FAT and EXT

    - by wim
    I have music that I play in my car from a FAT32 USB stick. The folder which I use to put songs on is stored on my EXT4 hard drive. I add/remove/retag songs regularly and occasionally want to rsync the changes to the USB stick. But for some unknown reason (maybe permissions?), rsync copies all the files every time rather than just the changed ones. I am calling rsync like this: rsync -vrlptgD source dest. How can I make it work like I want it to (i.e. know when a file hasn't been changed and not copy it)?
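    FAT32 cannot store Unix ownership, permissions or symlinks, and it keeps modification times at 2-second resolution, so rsync's default size-plus-mtime check decides every file has changed. Dropping the flags FAT cannot honour and allowing a timestamp window is the usual fix; the destination path is an example:

        rsync -vrt --modify-window=1 source/ /media/usbstick/music/

        # if timestamps still don't survive the stick's mount options, compare by size only
        rsync -vr --size-only source/ /media/usbstick/music/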

    Read the article

  • Reasons why crontab does not work

    - by Adam Matan
    Many a time and oft, crontab scripts are not executed as expected. There are numerous reasons for that, for example: wrong crontab notation, permissions, environment variables and many more. This community wiki aims to aggregate the top reasons for crontab scripts not being executed as expected. Write each reason in a separate answer. Please include one reason per answer - details about why it's not executed - and fix(es) for that one reason. Please write only cron-specific issues, e.g. commands that execute as expected from the shell but execute erroneously by cron.
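    For reference, a sketch of a crontab entry that sidesteps the most common traps (minimal PATH, relative paths, silently discarded output); the script path and address are placeholders:

        # crontab -e
        SHELL=/bin/bash
        PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
        MAILTO=admin@example.com

        # m h dom mon dow  command  (absolute paths, output captured for debugging)
        5 2 * * * /home/adam/bin/backup.sh >> /home/adam/backup.log 2>&1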

    Read the article

  • Nautilus only starts as root user

    - by user7978
    Hello. I am running Ubuntu 10.04 64-bit. When I attempt to start Nautilus from the command line, it does not appear -- although a PID is generated. As root/sudo, I can start Nautilus fine. One note: I run e16 as the window manager, so I do not use Nautilus to draw my desktop. However, even under this configuration, Nautilus used to run fine as a "regular" user. The permissions for Nautilus are the same as the other packages in /usr/bin. I believe this is a Gnome issue, but I'm fumbling at this point.
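    A hedged sketch of the usual checks: run Nautilus in the foreground so its errors are visible, and look for configuration files under the home directory that ended up owned by root after past sudo/gksudo sessions:

        nautilus --no-desktop                       # foreground, errors visible; suits e16
        find ~ -maxdepth 3 -user root 2>/dev/null   # root-owned files in $HOME are the usual culprit

        # give anything that shows up back to your user, e.g.:
        sudo chown -R $USER:$USER ~/.config ~/.gconf ~/.gconfd ~/.nautilus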

    Read the article

  • Solved: Chrome v18, self signed certs and “signed using a weak signature algorithm”

    - by David Christiansen
    So Chrome has just updated itself automatically and you are now running v18 – great. Or is it…
    If, like me, you are someone who runs sites using a self-signed SSL certificate (i.e. when running a site on a developer machine), you may come across the following lovely message.
    Fear not: this is likely a result of following instructions you found on the Apache/OpenSSL sites, which produce a self-signed cert using the MD5 signature hashing algorithm.
    Using OpenSSL
    The simple fix is to generate a new certificate, specifying the SHA512 signature hashing algorithm, like so:

        openssl req -new -x509 -sha512 -nodes -out server.crt -keyout server.key

    Simples! Now you should be able to confirm that the signature algorithm used is sha512 by looking at the details tab of the certificate.
    Notes
    If you change your certificate, be sure to reapply any private key permissions you require – such as allowing access to the application pool user.
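    And a quick way to confirm the algorithm from the command line rather than the certificate dialog:

        openssl x509 -in server.crt -noout -text | grep "Signature Algorithm"
        # expect something like: Signature Algorithm: sha512WithRSAEncryption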

    Read the article
