Search Results

Search found 17710 results on 709 pages for 'portable home directories'.

Page 22/709 | < Previous Page | 18 19 20 21 22 23 24 25 26 27 28 29  | Next Page >

  • How to create a link to Nintex Start Workflow Page in the document set home page

    - by ybbest
    In this blog post, I’d like to show you how to create a link to the Nintex Start Workflow page on the document set home page.
    1. First, upload the latest version of jQuery to the Style Library of your team site.
    2. Then upload a text file to the Style Library to hold your own HTML and JavaScript.
    3. On the document set home page, insert a new Content Editor web part and link it to the text file you just uploaded.
    4. Update the text file with the following content (you can download this file here):

        <script type="text/javascript" src="/Style%20Library/jquery-1.9.0.min.js"></script>
        <script type="text/javascript" src="/_layouts/sp.js"></script>
        <script type="text/javascript">
        // shared between the callbacks below
        var listItemId, web, list;

        $(document).ready(function () {
            listItemId = getParameterByName("ID");
            setTheWorkflowLink("YBBESTDocumentLibrary");
        });

        function buildWorkflowLink(webRelativeUrl, listId, itemId) {
            // ensure exactly one "/" before _layouts: get_serverRelativeUrl() returns "/"
            // for the root web but no trailing slash for subsites
            if (webRelativeUrl.charAt(webRelativeUrl.length - 1) !== "/") {
                webRelativeUrl += "/";
            }
            return webRelativeUrl + "_layouts/NintexWorkflow/StartWorkflow.aspx?list=" + listId + "&ID=" + itemId + "&WorkflowName=Start Approval";
        }

        function getParameterByName(name) {
            name = name.replace(/[\[]/, "\\\[").replace(/[\]]/, "\\\]");
            var regexS = "[\\?&]" + name + "=([^&#]*)";
            var regex = new RegExp(regexS);
            var results = regex.exec(window.location.search);
            if (results == null) {
                return "";
            }
            return decodeURIComponent(results[1].replace(/\+/g, " "));
        }

        function setTheWorkflowLink(listName) {
            var SPContext = new SP.ClientContext.get_current();
            web = SPContext.get_web();
            list = web.get_lists().getByTitle(listName);
            SPContext.load(web, "ServerRelativeUrl");
            SPContext.load(list, "Title", "Id");
            SPContext.executeQueryAsync(setTheWorkflowLink_Success, setTheWorkflowLink_Fail);
        }

        function setTheWorkflowLink_Success(sender, args) {
            var startWorkflowLink = buildWorkflowLink(web.get_serverRelativeUrl(), list.get_id(), listItemId);
            $("a#submitLink").attr("href", startWorkflowLink);
        }

        function setTheWorkflowLink_Fail(sender, args) {
            alert("There is a problem setting up the submit exam approval link");
        }
        </script>
        <a href="" target="_blank" id="submitLink"><span style="font-size:14pt">Start the approval process.</span></a>

    5. Save your changes and go to the document set item; you will see the link on the home page now.
    Notes:
    1. You can create a link to start the workflow using the following build-dynamic-string configuration: {Common:WebUrl}/_layouts/NintexWorkflow/StartWorkflow.aspx?list={Common:ListID}&ID={ItemProperty:ID}&WorkflowName=workflowname. With this link you will still need to click the Start button; this is standard SharePoint behaviour and cannot be altered.
    References:
    http://connect.nintex.com/forums/27143/ShowThread.aspx
    How to use HTML and JavaScript in a Content Editor web part in SharePoint 2010

    Read the article

  • vsftpd restrict local users to home and group directories

    - by wag2639
    I've got vsftpd installed on an Ubuntu Server 9.10 box. I can use chroot to restrict users to their own home directories, but I also want to give them access to a group-shared folder. For example: users foo1 and foo2 are local users in the group foos; I want foo1 to have access to /home/foo1 and /svr/foos, and foo2 to have access to /home/foo2 and /svr/foos. Other notes: I'm using PAM and enforcing local-user SSL. I already tried mount --bind, but it does weird permissions when you try to mount-bind multiple users to the same…
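    For what it's worth, a sketch of one common pattern, reusing the question's names and paths: the chroot stays on, the share is bind-mounted into each jail, and a setgid directory plus a group umask keeps the shared files group-writable (the "weird permissions" usually come from the underlying directory, not from the bind mount itself).

        # /etc/vsftpd.conf – jail local users into their home directories
        chroot_local_user=YES
        local_umask=002

        # shared folder owned by the group, setgid so new files inherit it
        groupadd foos
        usermod -aG foos foo1
        usermod -aG foos foo2
        chgrp foos /svr/foos
        chmod 2775 /svr/foos

        # expose the share inside each user's jail
        mkdir /home/foo1/shared /home/foo2/shared
        mount --bind /svr/foos /home/foo1/shared
        mount --bind /svr/foos /home/foo2/shared

        # /etc/fstab entries so the bind mounts survive a reboot
        /svr/foos  /home/foo1/shared  none  bind  0  0
        /svr/foos  /home/foo2/shared  none  bind  0  0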

    Read the article

  • Video converter to convert any format to a light-weight AVI format (for mobile and portable multimedia devices)

    - by infant programmer
    These answers don't satisfy MY needs. My mobile supports the .3gp and .avi formats. 3gp files are always smaller in size, but with the least quality (especially the audio part). AVI certainly exhibits better quality, but the video converter I am using (namely Xillisoft VidConverter) outputs AVI files of very large size, which isn't suitable for portable devices. So I'm looking for (essentially free or open source) software that creates smaller files with better quality than 3gp! Thank you :-)
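    For reference, a hedged one-liner with ffmpeg (free and open source) that tends to produce small AVI files older phones accept; the codec choices and bitrates below are assumptions to tune for your device, and the scale=...:-2 syntax needs a reasonably recent ffmpeg:

        # transcode to a small Xvid-in-AVI + MP3 file (bitrates are guesses – tune them)
        ffmpeg -i input.mp4 \
               -c:v mpeg4 -vtag xvid -b:v 400k -vf scale=320:-2 \
               -c:a libmp3lame -b:a 64k -ac 2 \
               output.avi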

    Read the article

  • Physical Directories vs. MVC View Paths

    - by Rick Strahl
    This post falls into the bucket of operator error on my part, but I want to share it anyway because it describes an issue that has bitten me a few times now, and writing it down might keep it a little stronger in my mind. I've been working on an MVC project the last few days, and at the end of a long day I accidentally moved one of my View folders from the MVC root folder to the project root. It must have been at the very end of the day before shutting down, because tests and manual site navigation worked fine just before I quit for the night. I checked in my changes and called it a night. The next day I came back, started running the app and had a lot of breaks with certain views. Oddly, custom routes to these controllers/views worked, but stock /{controller}/{action} routes would not. After a bit of spelunking I realized "Hey, one of my View folders is missing", which made some sense given the error messages I got. I looked in the recycle bin - nothing there - so rather than try to figure out what the hell happened, I just restored from my last SVN checkin. At this point the folders are back… but… view access still ends up breaking for this set of views. Specifically I'm getting the Yellow Screen of Death with: CS0103: The name 'model' does not exist in the current context. Here's the full error:

        Server Error in '/ClassifiedsWeb' Application.
        Compilation Error
        Description: An error occurred during the compilation of a resource required to service this request. Please review the following specific error details and modify your source code appropriately.
        Compiler Error Message: CS0103: The name 'model' does not exist in the current context
        Source Error:
            Line 1: @model ClassifiedsWeb.EntryViewModel
            Line 2: @{
            Line 3:     ViewBag.Title = Model.Entry.Title + " - " + ClassifiedsBusiness.App.Configuration.ApplicationName;
        Source File: c:\Projects2010\Clients\GorgeNet\Classifieds\ClassifiedsWeb\Classifieds\Show.cshtml    Line: 1
        Version Information: Microsoft .NET Framework Version: 4.0.30319; ASP.NET Version: 4.0.30319.272

    Here's what's really odd about this error: the views now do exist in the /Views/Classifieds folder of the project, but it appears as if MVC is trying to execute the views directly. This is getting pretty weird, man! So I hook up some breakpoints in my controllers to see if my controller actions are getting fired - and sure enough it turns out they are not - but only for those views that were previously 'lost' and then restored from SVN. WTF? At this point I'm thinking that I must have messed up one of the config files, but after some more spelunking, and realizing that all the other controller views work, I give up that idea. Config's gotta be OK if other controllers and views are working.

    Root Folders and MVC Views don't mix
    As I mentioned, the problem was the fact that I inadvertently managed to drag my View folder to the root folder of the project. Here's what this looks like in my FUBAR'd project structure after I copied back the /Views/Classifieds folder from SVN: there's the actual folder in /Views, and the accidental copy that sits at the root. I of course did not notice the /Classifieds folder at the root because it was excluded and didn't show up in the project. Now, before you call me a complete idiot, remember that this happened by accident - an accidental drag probably just before shutting down for the night. :-) So why does this break?
    MVC should be happy with views in the /Views/Classifieds folder, right? While MVC might be happy, IIS is not. A physical folder on disk takes precedence over MVC's routing: if a physical path exists that matches the URL, that path is accessed first, and essentially IIS tries to execute the .cshtml pages directly without ever routing to the controller methods. In the error page I showed above, my clue should have been that the view was served as c:\Projects2010\Clients\GorgeNet\Classifieds\ClassifiedsWeb\Classifieds\Show.cshtml rather than c:\Projects2010\Clients\GorgeNet\Classifieds\ClassifiedsWeb\Views\Classifieds\Show.cshtml, but of course I didn't notice that right away, just skimming to the end and looking at the file name. The reason /classifieds/list actually fires that file is that the ASP.NET Web Pages engine looks for physical files on disk that match a path. IOW, with Web Pages you drop the .cshtml off the Razor page and IIS will serve it just fine; so /classifieds/list tries to find /classifieds/list.cshtml and executes that script. And that is exactly what's happening: Web Pages tries to execute the .cshtml file and fails, because Web Pages knows nothing about the @model tag, which is an MVC-specific template extension. This is why my breakpoints in the controller methods didn't fire, and it also explains why the error says the @model keyword is invalid (@model is an MVC-provided template enhancement to the Razor engine). The solution of course is super simple: delete the accidentally created root folder and the problem is solved.

    Routing and Physical Paths
    I've run into problems with this before, actually. In the past I've had a number of applications with a physical /Admin folder, which would conflict with an MVC Admin controller. More than once I ended up wondering why the index route (/Admin/) was not working properly. If a physical /Admin folder exists, /Admin will not route to the Index action (or whatever default action you have set up) but will instead try to list the directory or show the default document in the folder. The only way to force the index page through MVC is to explicitly use /Admin/Index. It makes perfect sense once you realize the physical folder is there, but that's easy to forget in an MVC application. As you might imagine, after a few times of running into this I gave up on the Admin folder and moved everything into MVC views to handle those operations. Still, it's one of those things that can easily bite you, because the behavior and error messages seem to point at completely different problems. Moral of the story: if you see routing problems where routes are not reaching obvious controller methods, always check to make sure there isn't a physical path being mapped by IIS instead.
    That way you won't feel stupid like I did after trying a million things for about an hour before discovering my sloppy mousing behavior :-)
    © Rick Strahl, West Wind Technologies, 2005-2012. Posted in MVC, IIS7

    Read the article

  • Advice needed for a home network setup (hardware & software) to handle many clients and potentially heavy traffic

    - by posdef
    I have recently decided to restructure the home network of our flatshare. Here's a quick outline of the situation. I envision having the following 4 devices connected to the router via cable:
    Xbox 360
    IP phone
    Printer
    QNAP server (web, file and multimedia)
    We are three people living here, so on top of that there will be up to 5-6 computers/mobile devices connecting as wireless clients. My goal is to be able to transfer files (when needed) between the computers and the multimedia server, which I can reach via the 360 and play on the TV. I also would like to keep a high level of security; right now I have WPA2 encryption and MAC filtering. I don't believe the web server will get heavy traffic, though I would like it to be responsive. Likewise, I don't have a habit of downloading via torrents etc., but I greatly appreciate my network being responsive and fast, especially when I am browsing or streaming high-quality media. Now my questions are: Is this setup feasible? Smart? Efficient? Can it be improved somehow? My current router (D-Link DI-624) and the previous one (DI-524) used to drop the network spontaneously, which I find highly irritating. I don't trust my router, especially now that it completely crashed when I test-ran the setup by transferring a large media file to the server while the Xbox was playing music from the server and two computers were browsing the net. Do I need to get new hardware, and if so, are there any recommendations for a reliable and fast router?

    Read the article

  • My website directories download instead of opening in the browser

    - by numerical25
    I added some screencasts to show what I am having issues with:
    http://screencast.com/t/212t3ANINqk
    http://screencast.com/t/bR44U1wkvNZl
    http://screencast.com/t/iDS7APYYsa
    The browser downloads my subdirectories instead of opening them and displaying the index file of the page. Here is the situation: I am trying to get my web service up using MacPorts and I am just trying to configure all the files. I am using PHP, Apache, etc. The localhost root loads fine, but anything beyond that cannot be found.
    Edit: I've tried adding the following to httpd.conf within the <IfModule mime_module> block, but no luck:

        AddType application/x-httpd-php .php
        AddType application/x-httpd-php .phtml
        AddType application/x-httpd-php .php3
        AddType application/x-httpd-php .php4
        AddType application/x-httpd-php .html
        AddType application/x-httpd-php-source .phps
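    A frequent cause of the browser downloading pages instead of rendering them is that the PHP module itself was never loaded, so the AddType lines have no handler to feed. A sketch for a MacPorts-style layout (the module path and file names are assumptions; check what your PHP port actually installed):

        # /opt/local/apache2/conf/httpd.conf
        LoadModule php5_module modules/mod_php5.so

        <IfModule php5_module>
            AddHandler application/x-httpd-php .php
            DirectoryIndex index.php index.html
        </IfModule>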

    Read the article

  • Website hosted at home pingable from outside, but not browseable from outside

    - by Richard DesLonde
    I have a simple setup.
    The server at home has local I.P. 192.168.1.3.
    IIS is running on the server and the website is up.
    Windows Firewall on the server has an exception rule for port 80 TCP.
    The router has static I.P. XX.XXX.XX.XXX.
    The router is forwarding TCP port 80 to 192.168.1.3.
    My domain registrar is my DNS host and points to the static I.P. XX.XXX.XX.XXX of the router.
    Here's what I can and can't do: I can browse the website from within my home network, either by I.P. or domain name. I can ping the domain and the I.P. from outside the network (from a computer at work). I can't browse the website either by domain name or by I.P. Weird. Why can't I browse my website? Incidentally, I wasn't sure this question was appropriate for SO, but after finding a few others similar to it on SO, and no comments on those questions saying anything about them being inappropriate, I decided I would post this question. Let me know if this is not appropriate for SO, or is more appropriate for another of the SE websites. Thanks!
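    One thing worth keeping in mind: a successful ping only proves ICMP reaches the router; browsing needs TCP port 80, which many consumer ISPs block inbound. A quick check from the outside machine (reusing the placeholder address from the question):

        # does anything answer on TCP port 80 at the public address?
        curl -I http://XX.XXX.XX.XXX/

        # or, without curl, test the raw TCP connection
        telnet XX.XXX.XX.XXX 80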

    Read the article

  • Apache Not Accepting a Path in My Home Folder

    - by Promather
    I have been trying to set up an Apache site to use a folder in my home folder, without any success. I followed the steps on this page exactly: https://help.ubuntu.com/community/ApacheMySQLPHP. Yet I did not succeed; I keep getting error 403, which says that the server doesn't have permission to access the requested page. I searched forums and many suggested changing the permissions of the folder. I went straight away and set the permissions to 777, but that didn't solve the problem. I made another search and somebody gave me a clue: it could be because my home folder is encrypted. I believe this could be the problem, but: What is the relation between encryption and Apache? I suppose the Apache server requests the file from the system, rather than trying to access the file's bytes directly! Is there any way to solve this problem? I don't want to move the folder to /var/www because I am using this Apache for testing, so I want whatever change I make to be immediately reflected, rather than having to copy files, which is error prone.
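    Two common culprits, for what it's worth (a sketch assuming a stock Ubuntu setup): the Apache user (www-data) needs execute (traverse) permission on every directory along the path, and an ecryptfs-encrypted home is only mounted in decrypted form while you are logged in, so Apache can see nothing there after you log out, regardless of the permission bits.

        # show the permissions along the whole path ("public_html" is an example)
        namei -m /home/username/public_html

        # let other users traverse (not read) the home directory itself
        chmod o+x /home/username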

    Read the article

  • [MINI HOW-TO] Repair Missing External Hard Drive Database Error in WHS

    - by Mysticgeek
    If you’re using external hard drives with your Windows Home Server, they might get unplugged and create an error. Here we look at running the Repair Wizard to quickly fix the issue. If an external drive that is included in your drive pool becomes unplugged or loses power, you might see the following error under Home Network Health when opening the WHS Console. To fix the issue, verify the drive has power and is plugged in correctly, then click Repair. The wizard launches and you’ll need to agree that you may lose data during the repair of the backup database. In this example it was a simple problem where an external drive became unplugged from the server, so you can close out of the wizard. If you look under Server Storage you can see the drive is missing; to fix the issue, verify the drive has power and is plugged in correctly. WHS will add the drive back into the pool, and when finished you’ll see it listed as healthy and good to go. Using external drives as part of your storage pool may not be the best way to build your home WHS setup, but if you do, expect occasional errors such as this.

    Read the article

  • Home server hard drive: 186k start-stop cycles in 325 days?

    - by j-g-faustus
    I set up a home server about a year ago, using Ubuntu Server (10.04 LTS at the moment), four disks in RAID 5 for storage (WD Green 1.5 TB) and a laptop drive for the OS. Today the output of smartctl, a command line utility for checking the SMART attributes of a hard drive, tells me that the primary OS drive has had no less than 186,000 start-stop cycles in 325 days and may be nearing the end of its lifespan. The smartctl output is in "normalized values", in this case a number between 200 and 000, where 200 is "brand new" and 000 means "worn out". My disk gets 001. So I wonder what happened: 186k start/stop cycles in 7820 hours is about one start/stop per 2.5 minutes around the clock. This seems somewhat excessive for a computer that sees actual use once or twice per day. (The RAID disks are normal, averaging one start/stop per day, as expected.) Does anyone have similar experiences, or pointers to what might be the issue here? Specifically I'd like to know:
    Why the massive start/stop count? Do I have some sort of configuration issue? Could there be a background service that is causing trouble?
    Could having a laptop disk as the OS drive be part of the problem? Can anyone confirm or deny this?
    Here is the /etc/hdparm.conf configuration:

        /dev/sda {
            apm = 127
            spindown_time = 120
        }

    and the most relevant parts of smartctl --attributes /dev/sda:

        smartctl version 5.38 [x86_64-unknown-linux-gnu] Copyright (C) 2002-8 Bruce Allen
        === START OF READ SMART DATA SECTION ===
        SMART Attributes Data Structure revision number: 16
        Vendor Specific SMART Attributes with Thresholds:
        ID#  ATTRIBUTE_NAME        FLAG    VALUE WORST THRESH TYPE     UPDATED WHEN_FAILED RAW_VALUE
          1  Raw_Read_Error_Rate   0x002f  200   200   051    Pre-fail Always  -           0
          4  Start_Stop_Count      0x0032  001   001   000    Old_age  Always  -           185875
          9  Power_On_Hours        0x0032  090   090   000    Old_age  Always  -           7820
         12  Power_Cycle_Count     0x0032  100   100   000    Old_age  Always  -           109
        193  Load_Cycle_Count      0x0032  118   118   000    Old_age  Always  -           246833
        194  Temperature_Celsius   0x0022  107   098   000    Old_age  Always  -           36

    As I generally prefer my drives to last more than a year, any advice is appreciated.
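    For what it's worth, apm = 127 (as in the hdparm.conf above) permits the drive's internal power management, and many laptop drives park their heads or spin down very aggressively at that level, which racks up start/stop and load-cycle counts exactly like this. A common mitigation, as a sketch:

        # raise the APM level so the drive stops parking/spinning down constantly
        sudo hdparm -B 254 /dev/sda

        # or persistently, in /etc/hdparm.conf:
        /dev/sda {
            apm = 254
        }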

    Read the article

  • Will adding top level directories with similar structure to existing directories change the SEO of my site?

    - by Russell Sims
    I've been pointed this way for SEO-related questions, and this one has had me pondering for a little while now. I'm recreating a site's structure. The website's content is generated through several feeds, and unless I want to place each and every one of the 10,000-odd venues into its own category manually, I can't avoid categorising each item by its address. The current structure looks like this: Homepage > region > county > city/town > venue page, and the URL looks like domain/region/county/city/venue/. I'm relatively happy to use this structure as it's not too convoluted. However, we also promote deals, and we also group the venues by their respective franchise, which leads to URLs such as domain/groups and domain/deals. My question is: how would the directory structure look with these new additions? Would I have a URL that looks like domain/deals/region/county/city/venue or domain/group/region/county/city/venue, and just put a 301 or a canonical link tag on the page to prevent the duplicate pages competing with each other? Am I worrying about it needlessly? Perhaps I should link straight from domain/deals to the venue-page URL domain/region/county/city/venue; this bothers me a bit, though, as the deals and groups will not be in the breadcrumbs.

    Read the article

  • Using subdomains or directories for main categories?

    - by Matthieu
    I have a website which references places to travel to around the world. Those places are (of course) grouped by country. Here is an example of an actual URL on my website: http://awespot.org/country/105/iceland. I am wondering if it would be better, in terms of SEO, to separate the countries into subdomains: http://iceland.awespot.org/. I know Google treats subdomains as different websites, so I am weighing the 2 options: separating would mean also splitting the PageRank and the benefit of links to the website; but separating would enable me to create a web of related websites (all travel-related) that link to and benefit each other. I am only asking about SEO here; I know there are other questions raised by this possibility (user experience, administration, even password completion by web browsers).

    Read the article

  • What happens to remounted data/directories

    - by cauon
    According to suggestions in this post I am trying to tune my system to run better with a solid state drive. But regarding RAM disks and /etc/fstab usage, I have some understanding problems. So let's say I add the following lines to /etc/fstab:

        tmpfs /tmp       tmpfs defaults,noatime,nodiratime,mode=1777 0 0
        tmpfs /var/spool tmpfs defaults,noatime,nodiratime,mode=1777 0 0
        tmpfs /var/tmp   tmpfs defaults,noatime,nodiratime,mode=1777 0 0
        tmpfs /var/log   tmpfs defaults,noatime,nodiratime,mode=0755 0 0

    I know that on startup these locations should now get mounted in RAM (hopefully). But what happens to the physical data that was at those places before? Is it gone? Will it be back when I edit my /etc/fstab back to the version without tmpfs? Will the space still be allocated on my SSD in a way that I can't use it for any other data? Sometimes it is suggested to add the following line, too:

        none /var/cache aufs dirs=/tmp:/var/cache=ro 0 0

    What does this actually do? I noticed that /var/cache takes almost 1 GB of space on my hard disk. So should I clear the directory before activating this line? (This is related to the former question.) These things cause me some confusion and I hope you can give me some clarification.
    UPDATE: I downloaded a 600 MB image into /tmp, which is mounted with the tmpfs settings above. Then I wanted to compare the RAM usage before and after the download. I expected the RAM usage to increase by 600 MB after the download, but the System Monitoring tool showed no changes at all. How can this be? Does tmpfs work differently than I expect it to?
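    For what it's worth: mounting a tmpfs over a directory hides, but does not delete, the files underneath; they keep occupying disk space and reappear as soon as the tmpfs is unmounted. Files written to a tmpfs live in the page cache, so many monitors count them as "cached" rather than "used" memory, which would explain the unchanged reading after the 600 MB download. A quick way to see the shadowing behaviour for yourself (the path is an example):

        sudo mkdir -p /mnt/test
        sudo mount -t tmpfs tmpfs /mnt/test   # anything previously in /mnt/test is now hidden
        df -h /mnt/test                       # shows a tmpfs, not the disk
        sudo umount /mnt/test                 # the original on-disk contents reappear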

    Read the article

  • Portable hard disk with write-protect switch

    - by Wadih M.
    Hi, I'm looking for portable hard disks that have a physical switch (with or without a key) to make them read-only and prevent modification. Is there a particular keyword I should look for that designates this type of drive, or models I can look into? Anything that can guide me will help. Thanks

    Read the article

  • Compact DIY Office-in-a-Cart Packs Away Into a Closet

    - by Jason Fitzpatrick
    Many geeks know the pain of losing a home office when a new baby comes along, but not many of them go to such lengths to miniaturize their offices as this. With a little ingenuity, an entire home office now fits inside a heavily modified IKEA work table. Ian, an IKEAHacker reader and Los Angeles area geek, explains the motivation for the build: I had to surrender my home office to make room for my new baby boy ;) I took an Ikea stainless steel kitchen “work table”, some Ikea computer tower desk trays, two steel tabletops, and two grated steel shelves to make an “office” that I could pack away into a closet. Hit up the link below to check out the full photo set; the build includes quite a few clever design choices like mounted monitors, a ventilation system, and more. Home Office In A Box [IKEAHacker]

    Read the article

  • Ubuntu 14.04 has a bunch of old kernel directories

    - by NoBugs
    I saw in Disk Usage Analyzer that I have 3.13.0-xx directories in /lib/modules for 8 minor versions of the kernel. Each is around 200 MB. I remember having to go through Synaptic and remove those old Linux versions before, but hasn't this bug been fixed? Is it just a paranoid default setting - keeping each old kernel around in case the last half dozen all become unbootable? Or do I have some developer setting enabled by accident that causes this?
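    The trees under /lib/modules normally belong to superseded linux-image packages, and on an up-to-date 14.04 the package manager can clear them itself. A hedged sketch (list first, purge second; the running kernel is never removed):

        # see which kernel packages are actually installed (ii = installed)
        dpkg -l 'linux-image-*' | grep ^ii
        uname -r    # the kernel currently running

        # purge kernels that are no longer needed
        sudo apt-get autoremove --purge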

    Read the article

  • Use HDD enclosure as portable hard drive?

    - by jeffmangum
    Someone here suggested that I should use an external HDD enclosure, since my PC doesn't support a second HDD and I want to use my old one. I will be permanently plugging that HDD enclosure into my PC, and someone said that isn't safe. I'm really concerned, since I would be using that drive all the time - I'd probably always be playing the music files I have on it. Also, would it be OK to use that enclosure as a portable external drive?

    Read the article

  • Nautilus tags/labels/marks/emblems for directories

    - by madox2
    Is there any way to mark folders or files with tags (or labels, or whatever) in Nautilus? It would be nice to sort marked folders or files by these tags. I am familiar with emblems, but I can't sort by them, and it is complicated to assign an emblem to a folder (3 clicks at least). My first idea was to mark folders in my Movie directory with tags like seen, not seen, must see, and so on. Then I realized it would be useful in any other workspace, with any custom tags... Is there any Nautilus extension for this? Or any other file manager which can do this?

    Read the article

  • Data recovery on an Iomega portable drive.

    - by Kaji
    For Christmas, my little brother got an Iomega 500 GB portable hard drive. It had been working well, but last week it flat-out died, and the company is trying to shirk responsibility, claiming it's not under warranty and saying it will cost at least $900 to recover the data from the drive. He's still trying to fight the warranty issue, but wants to know, should it come down to it, what other options exist for recovering the data from the drive. (in before "BACK UP!")
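    If the drive still shows up when plugged in (i.e. the failure is in the enclosure's USB bridge or the filesystem rather than the disk mechanism), the usual DIY first step is to image it before touching anything else. A sketch using GNU ddrescue (the device name is an example - double-check it before running):

        # clone the failing drive to an image, retrying bad sectors up to 3 times;
        # the map file lets you stop and resume safely
        sudo ddrescue -d -r3 /dev/sdX drive.img rescue.map

    If the disk doesn't spin up or enumerate at all, software won't help and a data-recovery lab is the remaining option.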

    Read the article

  • ssh many users to one home

    - by filippo
    Hiya, I want to allow some trusted users to scp files onto my server (to a specific user's area), but I do not want to give these users a home of their own, nor an ssh login. I'm having problems understanding the correct users/groups settings I have to create to allow this to happen. I will give an example. Having:

        MyUser@MyServer (MyUser belongs to the group MyGroup; MyUser's home is, let's say, /home/MyUser)
        SFTPGuy1@OtherBox1
        SFTPGuy2@OtherBox2

    They give me their id_dsa.pub files and I add them to my authorized_keys. I reckon I'd then do on my server something like:

        useradd -d /home/MyUser -s /bin/false SFTPGuy1   # and the same for the other

    and, for the last step:

        usermod -aG MyGroup SFTPGuy1   # then again, for the other guy

    I'd then expect the SFTPGuys to be able to run sftp -o IdentityFile=id_dsa MyServer and be taken to MyUser's home... Well, this is not the case: sftp just keeps asking me for a password. Could someone point out what I am missing? Thanks a mil, f.
    [EDIT: Messa on Stack Overflow asked me whether the authorized_keys file was readable by the other users (members of MyGroup). It's an interesting point; this was my answer: it wasn't (it was 700), but I then changed the permissions of the .ssh dir and the authorized_keys file to 750, though still with no effect. I guess it's worth mentioning that my home dir (/home/MyUser) is also readable by the group; most dirs are 750 and the specific folder where they'd drop files is 770. Nevertheless, about the authorized_keys file, I reckon the authentication would be performed by the local user on MyServer, wouldn't it? If so, I don't understand the need for other users to read it... well, just wondering.]
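    For what it's worth, a common way to get exactly this (sftp-only access, no shell, everyone jailed into one directory) on OpenSSH 4.9 or later is a Match block in /etc/ssh/sshd_config; a sketch reusing the question's names. Two gotchas: a /bin/false shell blocks scp/sftp too unless sshd forces the in-process SFTP server, and the ChrootDirectory must be owned by root and not group-writable (put the writable drop folder underneath it).

        # /etc/ssh/sshd_config
        Subsystem sftp internal-sftp

        Match Group MyGroup
            ChrootDirectory /home/MyUser
            ForceCommand internal-sftp
            AllowTcpForwarding no
            X11Forwarding no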

    Read the article

  • I lose some directories when I upgrade from Ubuntu 11.10 to 12.04

    - by maythux
    Yesterday I upgraded my Ubuntu 11.10 desktop to Ubuntu 12.04. I was running about 7 KVM virtual machines, managed with virt-manager. Anyway, when the upgrade finished I found that virt-manager was not working, so I had to reconfigure it and install some missing packages that had been deleted! I solved that issue and then started to restore my virtual machines. I restored 2 machines without any problems; the third and fourth ones (Windows) ran a disk check that took more than 6 hours, but finally worked. For the other machines I can't find their attached hard disks; I don't know what happened, but I cannot find the files. 1. Does upgrading delete files? 2. Is there any way to restore those files? Thanks in advance
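    Before assuming the disk images are gone, it may be worth checking where each guest expects its disks to be; a hedged sketch (the guest name below is an example):

        # where libvirt keeps disk images by default
        ls -lh /var/lib/libvirt/images

        # ask libvirt which disk file a guest is configured to use
        virsh dumpxml myguest | grep "source file"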

    Read the article

  • Adding your website to free web directories as a link building strategy

    - by Man
    It's been two months since I launched my website. I recently ran into some websites which list directories of other websites. Some examples are http://www.addgoogleurl.com/ and webdirectorieslist.com, etc. I was talking with my colleague and he says adding the URL of my website to these kinds of websites will negate the effect of other organic, real links. Does Google count links from these web directory sites as positive or negative? Do you have any source to refer to for your answer? I found this question asked before on webmasters.SE, but it is asking about many links from a single website.

    Read the article
