Search Results

Search found 9558 results on 383 pages for 'share this'.

Page 8/383 | < Previous Page | 4 5 6 7 8 9 10 11 12 13 14 15  | Next Page >

  • How do I purge or empty Windows Explorer's network username and sharename cache?

    - by Abel
    While troubleshooting a Samba vs. Windows networking issue, I noticed that Windows Explorer remembers the login credentials of remote shares, even if you ask it not to. For instance, after accessing a share using \\servername\sharename plus entering username/password and then closing Windows Explorer, adding the same share as a network drive gives the following message, regardless of whether the username is the same or not: The network folder specified is currently mapped using a different user name and password. To connect using a different user name and password, first disconnect any existing mappings to this network share. Using NET USE does not show the share. After restarting the computer, I have no problems accessing the share using different credentials. But restarting just to test other credentials is annoying, especially while troubleshooting. How can I purge this cache on Windows Vista? Note: using nbtstat -R[R], ipconfig /renew, killing explorer.exe or disabling/re-enabling the network card didn't help.
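
    A sketch of the commands usually tried for clearing cached SMB sessions and saved credentials on Vista; not a guaranteed fix, and the server/share names are placeholders from the question:

      rem list the connections Explorer may still be holding open
      net use
      rem drop one connection, or every outstanding session at once
      net use \\servername\sharename /delete
      net use * /delete
      rem saved credentials are stored separately from sessions
      cmdkey /list
      cmdkey /delete:servername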

    Read the article

  • I can connect to the Samba server but cannot access shares

    - by jlego
    I'm having trouble getting samba sharing working to access shares. I have setup a stand-alone box running Fedora 16 to use as a file-sharing and web development server. It needs to be able to share files with a Windows 7 PC and a Mac running OSX Snow Leopard. I've setup Samba using the Samba configuration GUI tool on Fedora. Added users to Fedora and connected them as Samba users (which are the same as the Windows and Mac usernames and passwords). The workgroup name is the same as the Windows workgroup. Authentication is set to User. I've allowed Samba and Samba client through the firewall and set the ethernet to a trusted port in the firewall. Both the Windows and Mac machines can connect to the server and view the shares, however when trying to access the shares, Windows throws error: 0x80070035 " Windows cannot access \\SERVERNAME\ShareName." Windows user is not prompted for a username or password when accessing the server (found under "Network Places"). This also happens when connecting with the IP rather than the server name. The Mac can also connect to the server and see the shares but when choosing a share gives the error: The original item for ShareName cannot be found. When connecting via IP, the Mac user is prompted for username and password, which when authenticated gives a list of shares, however when choosing a share to connect to, the error is displayed and the user cannot access the share. Since both machines are acting similarly when trying to access the shares, I assume it is an issue with how Samba is configured. smb.conf: [global] workgroup = workgroup server string = Server log file = /var/log/samba/log.%m max log size = 50 security = user load printers = yes cups options = raw printcap name = lpstat printing = cups [homes] comment = Home Directories browseable = no writable = yes [printers] comment = All Printers path = /var/spool/samba browseable = yes printable = yes [FileServ] comment = FileShare path = /media/FileServ read only = no browseable = yes valid users = user1, user2 [webdev] comment = Web development path = /var/www/html/webdev read only = no browseable = yes valid users = user1 How do I get samba sharing working? UPDATE: I Figured it out, it was because I was sharing a second hard drive. See checked answer below. Speculation 1: Before this box I had another box with the same version of fedora installed (16) and samba working for these same computers. I started up the old machine and copied the smb.conf file from the old machine to the new one (editing the share definitions for the new shares of course) and I still get the same errors on both client machines. The only difference in environment is the hardware and the router. On the old machine the router received a dynamic public IP and assigned dynamic private IPs to each device on the network while the new machine is connected to a router that has a static public IP (still dynamic internal IPs though.) Could either one of these be affecting Samba? Speculation 2: As the directory I am trying to share is actually an entire internal disk, I have tried these things: 1.) changing the owner of the mounted disk from root to my user (which is the same username as on the Windows machine) 2.) made a share that only included one of the folders on the disk instead of the entire disk with my user again as the owner. Both tests failed giving me the same errors regarding the network address. Speculation 3: Whenever I try to connect to the share on the Windows 7 client I am prompted for my username and password. 
When I enter the correct credentials I get an access denied message. However I did notice that under the login box "domain: WINDOWS-PC-NAME" is listed. I believe this could very well be the problem. Speculation 4: So I've completely reinstalled Fedora and Samba now. I've created a share on the first harddrive (one fedora is installed on) and I can access that fine from Windows. However when I try to share any data on the second disk, I am receiving the same error. This I believe is the problem. I think I need to change some things in fstab or fdisk or something. Speculation 5: So in fstab I mapped the drive to automount in a folder which works correctly. I also added the samba_share_t SElinux label to the mountpoint directory which now allows me to access the shares on the Windows machine, however I cannot see any of the files in the directory on the windows machine. (They are there, I can see them in the fedora file browser locally)
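
    The final update suggests only the mountpoint itself received the samba_share_t label, not the files under it. A minimal sketch of making the label persistent and recursive on Fedora (share path taken from the smb.conf above; the package providing semanage is assumed to be policycoreutils-python):

      # semanage is assumed to live in policycoreutils-python on Fedora 16
      sudo yum install policycoreutils-python
      # record a persistent file-context rule for the share and everything below it
      sudo semanage fcontext -a -t samba_share_t "/media/FileServ(/.*)?"
      # apply the rule to the files that already exist on the disk
      sudo restorecon -Rv /media/FileServ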

    Read the article

  • Download and Share Visual Studio Color Schemes

    - by ScottGu
    As developers we often spend a large part of our day staring at code within Visual Studio.  If you are like me, after awhile the default VS text color scheme starts to get a little boring. The good news is that Visual Studio allows you to completely customize the editor background and text colors to whatever you want – allowing you to tweak them to create the experience that is “just right” for your eyes and personality.  You can then optionally export/import your color scheme preferences to an XML file via the Tools->Import and Export Settings menu command. [In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu] New website that makes it easy to download and share VS color schemes Luke Sampson launched the http://studiostyles.info/ site a week ago (built using ASP.NET MVC 2, ASP.NET 4 and VS 2010). Studiostyles.info enables you to easily browse and download Visual Studio color schemes that others have already created.  The color schemes work for both VS 2008 and VS 2010 (all versions – including the free VS express editions): Color schemes are sorted by popularity and voting (you can vote on whether you find each “hot or not”).  You can click any of the schemes to see screen-shots of it in use for common coding scenarios.  You can then download the color settings for either VS 2010 or VS 2008: You can also optionally upload color schemes of your own if you have a good one you want to share with others.  If you haven’t visited it yet – check it out: http://studiostyles.info/  And thank you Luke Sampson for building it! Hope this helps, Scott

    Read the article

  • How to share a folder using the Ubuntu One Web API

    - by Mario César
    I have successfully implemented OAuth authorization with Ubuntu One in Django; here are my views and models: https://gist.github.com/mariocesar/7102729 Right now I can use the file_storage Ubuntu One API. For example, the following asks whether a path exists, creates the directory, and then fetches the information on the created path to prove it was created. >>> user.oauth_access_token.get_file_storage(volume='/~/Ubuntu One', path='/Websites/') <Response [404]> >>> user.oauth_access_token.put_file_storage(volume='/~/Ubuntu One', path='/Websites/', data={"kind": "directory"}) <Response [200]> >>> user.oauth_access_token.get_file_storage(volume='/~/Ubuntu One', path='/Websites/').json() {u'content_path': u'/content/~/Ubuntu One/Websites', u'generation': 10784, u'generation_created': 10784, u'has_children': False, u'is_live': True, u'key': u'MOQgjSieTb2Wrr5ziRbNtA', u'kind': u'directory', u'parent_path': u'/~/Ubuntu One', u'path': u'/Websites', u'resource_path': u'/~/Ubuntu One/Websites', u'volume_path': u'/volumes/~/Ubuntu One', u'when_changed': u'2013-10-22T15:34:04Z', u'when_created': u'2013-10-22T15:34:04Z'} So it works, and I'm happy about that. But I can't share a folder. My question is: how can I share a folder using the API? I found no web API to do this; the Ubuntu One SyncDaemon tool is the only mention of a solution: https://one.ubuntu.com/developer/files/store_files/syncdaemontool#ubuntuone.platform.tools.SyncDaemonTool.offer_share But I'm reluctant to maintain D-Bus and a daemon on my server for every Ubuntu One connection I have authorization for. Does anyone have an idea how I can use a web API to programmatically share a folder? Even better, using the OAuth authorization tokens that I already have.

    Read the article

  • Gartner Market Share: Oracle is #1 in PPM for WW revenues

    - by Melissa Centurio Lopes
    By Sylvie Mackenzie, PMP. The Gartner report published in March 2014, Market Share: All Software Markets, Worldwide, 2013, shows Oracle as the leader in the Project and Portfolio Management space, with a market share of 22.5% and a growth rate of 4.9%. Gartner WW PPM Vendor Share, 2013

    Read the article

  • rkhunter: what is the right way to handle these warnings?

    - by zuba
    I googled a bit and checked out the first two links it found: http://www.skullbox.net/rkhunter.php http://www.techerator.com/2011/07/how-to-detect-rootkits-in-linux-with-rkhunter/ They don't mention what I should do about warnings such as these: Warning: The command '/bin/which' has been replaced by a script: /bin/which: POSIX shell script text executable Warning: The command '/usr/sbin/adduser' has been replaced by a script: /usr/sbin/adduser: a /usr/bin/perl script text executable Warning: The command '/usr/bin/ldd' has been replaced by a script: /usr/bin/ldd: Bourne-Again shell script text executable Warning: The file properties have changed: File: /usr/bin/lynx Current hash: 95e81c36428c9d955e8915a7b551b1ffed2c3f28 Stored hash : a46af7e4154a96d926a0f32790181eabf02c60a4 Q1: Are there more extensive HowTos that explain how to deal with the different kinds of warnings? And the second question: were my actions sufficient to resolve these warnings? a) Find the package that contains the suspicious file, e.g. it is debianutils for the file /bin/which ~ > dpkg -S /bin/which debianutils: /bin/which b) Check the debianutils package checksums: ~ > debsums debianutils /bin/run-parts OK /bin/tempfile OK /bin/which OK /sbin/installkernel OK /usr/bin/savelog OK /usr/sbin/add-shell OK /usr/sbin/remove-shell OK /usr/share/man/man1/which.1.gz OK /usr/share/man/man1/tempfile.1.gz OK /usr/share/man/man8/savelog.8.gz OK /usr/share/man/man8/add-shell.8.gz OK /usr/share/man/man8/remove-shell.8.gz OK /usr/share/man/man8/run-parts.8.gz OK /usr/share/man/man8/installkernel.8.gz OK /usr/share/man/fr/man1/which.1.gz OK /usr/share/man/fr/man1/tempfile.1.gz OK /usr/share/man/fr/man8/remove-shell.8.gz OK /usr/share/man/fr/man8/run-parts.8.gz OK /usr/share/man/fr/man8/savelog.8.gz OK /usr/share/man/fr/man8/add-shell.8.gz OK /usr/share/man/fr/man8/installkernel.8.gz OK /usr/share/doc/debianutils/copyright OK /usr/share/doc/debianutils/changelog.gz OK /usr/share/doc/debianutils/README.shells.gz OK /usr/share/debianutils/shells OK c) Relax about /bin/which, since debsums reports /bin/which OK d) Put /bin/which into /etc/rkhunter.conf as SCRIPTWHITELIST="/bin/which" e) For warnings like the one for /usr/bin/lynx, update the stored checksum with rkhunter --propupd /usr/bin/lynx.cur Q2: Am I resolving such warnings the right way?
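
    For reference, a sketch of how such warnings are typically silenced once the files have been verified against their packages (as done with dpkg -S and debsums above); the paths are the ones from the warnings:

      # /etc/rkhunter.conf: whitelist binaries that are legitimately shell/perl scripts
      SCRIPTWHITELIST=/bin/which
      SCRIPTWHITELIST=/usr/sbin/adduser
      SCRIPTWHITELIST=/usr/bin/ldd

      # refresh the stored file properties, then re-run the check
      sudo rkhunter --propupd
      sudo rkhunter --check --sk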

    Read the article

  • share distribution question

    - by facebook-100000781341887
    Hi, I just developed a Facebook game (Mafia-like), but the graphics I made are not good, because they are based on existing photos that I traced in AI and coloured. So I invited a friend to join me; he is a graphic designer who owns a company with his friend (I know both of them). For the equity split, I expected at least 70% for me and at most 30% for them (both of them want to join). They gave me a counter-offer of 60% for me and 40% for them, which I feel is unacceptable, because they would only build the artwork part-time, while all the other work (coding, web hosting, etc.) is what I do full-time. Their argument for 40% is that they will produce good graphics and can provide an advertising channel (a local magazine), etc. I don't think the game needs advertising in a local magazine because it isn't targeted locally. Please give me some comments on this (is the split fair? how important is a game's artwork, and is it worth more than 30%?), or share your experience with this kind of arrangement. Thanks in advance.

    Read the article

  • Sharing files between multiple sites using only desktop software

    - by perlyking
    Our organisation has three sites; a head office, where the master copies of company files are stored, plus two branch offices using only workstations and a NAS or two. Currently we're talking about <10GB. At the main office, we have no admin access to the file server, as this is entirely controlled by the larger institution where we are located. For the same reason, we have no VPN remote access to this network. Instead, we simply have access to a network share using over a Novell LAN. Question: how can we share files between offices in way that minimises latency, i.e. that gives us a mirror of the main network share at each site? (There is little likelihood of concurrent editing, and we can live with the odd file conflict now and again). Up to now branch office staff have had to use GotoMyPC-type solutions to remotely access files held at the main office. Or email. I was hoping to use Google Drive on a dedicated workstation at each office to sync the contents of the network share (head office) or NAS (branch offices) via the cloud, but at my last attempt (29 Jun '12), the Google Drive installer would not allow me to designate the remote network share as the "target" folder. (I chose Google Drive over Drobbox et al. as we already use GMail for corporate mail) The next idea was to use a designated workstation at head office to mirror the network share to a local drive, then use Google Drive to push that to the cloud. This seems a step too far. Nor do I have any good ideas about how to achieve this network/local mirroring, as we can't, for example, install the rsync daemon on the server. I do not want to use Google Drive locally on each workstation as this will inconvenience users, and more importantly, move files off the backed-up, well-maintained (UPS, RAID etc) network share at head office. Our budget is only in the £100's. Should we perhaps just ditch the head office server and use something like JungleDisk? At least this presents the user with what appears to be a mapped drive.

    Read the article

  • Why aren't the scripts in my autoload folder being executed in Vim?

    - by Codemonkey
    I'm trying to use Pathogen to manage my Vim extensions. My bundle folder looks like this: .../bundle/ +-- vim-pathogen ¦   +-- autoload ¦   +-- pathogen.vim +-- vim-smoothscroll +-- autoload +-- smooth_scroll.vim And my vimrc file includes this: let s:root = fnamemodify(resolve(expand(":p")), ":h") " Initiate pathogen. exec "source " . s:root . "/vimfiles/bundle/vim-pathogen/autoload/pathogen.vim" exec pathogen#infect() My vimrc file is a symlink located in ~ but pointing to a folder inside my Dropbox folder. This appears to work when I start Vim. Pathogen has added vim-smoothscroll to my runtimepath: :set runtimepath? runtimepath=~/Dropbox/Personal/config_sync/vim/vimfiles,~/Dropbox/Personal/config_sync/vim/vimfiles/bundle/vim-p athogen,~/Dropbox/Personal/config_sync/vim/vimfiles/bundle/vim-smoothscroll,~/.vim,~/vim/share/vim/vimfiles,~/vim/ share/vim/vim74,~/vim/share/vim/vimfiles/after,~/.vim/after The problem is that the script smooth_scroll.vim hasn't been loaded: 1: ~/.vimrc 2: ~/Dropbox/Personal/config_sync/vim/vimfiles/bundle/vim-pathogen/autoload/pathogen.vim 3: ~/vim/share/vim/vim74/syntax/syntax.vim 4: ~/vim/share/vim/vim74/syntax/synload.vim 5: ~/vim/share/vim/vim74/syntax/syncolor.vim 6: ~/vim/share/vim/vim74/filetype.vim 7: ~/vim/share/vim/vim74/menu.vim 8: ~/vim/share/vim/vim74/autoload/paste.vim 9: ~/Dropbox/Personal/config_sync/vim/vimfiles/colors/codeschool.vim 10: ~/Dropbox/Personal/config_sync/vim/_vimrc_gui 11: ~/Dropbox/Personal/config_sync/vim/_vimrc_keybinds 12: ~/vim/share/vim/vim74/plugin/getscriptPlugin.vim 13: ~/vim/share/vim/vim74/plugin/gzip.vim 14: ~/vim/share/vim/vim74/plugin/matchparen.vim 15: ~/vim/share/vim/vim74/plugin/netrwPlugin.vim 16: ~/vim/share/vim/vim74/plugin/rrhelper.vim 17: ~/vim/share/vim/vim74/plugin/spellfile.vim 18: ~/vim/share/vim/vim74/plugin/tarPlugin.vim 19: ~/vim/share/vim/vim74/plugin/tohtml.vim 20: ~/vim/share/vim/vim74/plugin/vimballPlugin.vim 21: ~/vim/share/vim/vim74/plugin/zipPlugin.vim 22: ~/vim/share/vim/vim74/syntax/ruby.vim 23: ~/vim/share/vim/vim74/syntax/vim.vim 24: ~/vim/share/vim/vim74/syntax/python.vim Why is that? Loading the script manually works fine.

    Read the article

  • "Network Error - 53" while trying to mount NFS share in Windows Server 2008 client

    - by Mike B
    CentOS | Windows 2008 I've got a CentOS 5.5 server running nfsd. On the Windows side, I'm running Windows Server 2008 R2 Enterprise. I have the "Files Services" server role enabled and both Client for NFS and Server for NFS are on. I'm able to successfully connect/mount to the CentOS NFS share from other linux systems but am experiencing errors connecting to it from Windows. When I try to connect, I get the following: C:\Users\fooadmin>mount -o anon 10.10.10.10:/share/ z: Network Error - 53 Type 'NET HELPMSG 53' for more information. (IP and share name have been changed to protect the innocent :-) ) Additional information: I've verified low-level network connectivity between the Windows client and the NFS server with telnet (to the NFS on TCP/2049) so I know the port is open. I've further confirmed that inbound and outbound firewall ports are present and enabled. I came across a Microsoft tech note that suggested changing the "Provider Order" so "NFS Network" is above other items like Microsoft Windows Network. I changed this and restarted the NFS client - no luck. I've confirmed that the share folder on the NFS server is readable/writable by all (777) I've tried other variations of the mount command like: mount 10.10.10.10:/share/ z: and mount 10.10.10.10:/share z: and mount -o anon mtype=hard \\10.10.10.10:/share * No luck. As per the command output, I tried typing NET HELPMSG 53 but that doesn't tell me much. Just "The network path was not found". I'm lost on how to proceed with troubleshooting. Any ideas?
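
    The Windows NFS client needs the portmapper and mountd services as well, not just nfsd on TCP/2049, so a reasonable next step is to check from the CentOS side which RPC ports are actually registered and what is exported. A sketch of the checks, run as root on the server:

      # confirm what is exported and to whom
      showmount -e localhost
      exportfs -v
      # list the RPC services and the ports they registered;
      # mountd/statd usually land on random ports unless pinned in /etc/sysconfig/nfs
      rpcinfo -p

    If mountd and portmapper (port 111) are not reachable from the Windows box, the mount fails even though 2049 answers.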

    Read the article

  • IIS7 web farm - local or shared content?

    - by rbeier
    We're setting up an IIS7 web farm with two servers. Should each server have its own local copy of the content, or should they pull content directly from a UNC share? What are the pros and cons of each approach? We currently have a single live server WEB1, with content stored locally on a separate partition. A job periodically syncs WEB1 to a standby server WEB2, using robocopy for content and msdeploy for config. If WEB1 goes down, Nagios notifies us, and we manually run a script to move the IP addresses to WEB2's network interface. Both servers are actually VMs running on separate VMWare ESX 4 hosts. The servers are domain-joined. We have around 50-60 live sites on WEB1 - mostly ASP.NET, with a few that are just static HTML. Most are low-traffic "microsites". A few have moderate traffic, but none are massive. We'd like to change this so both WEB1 and WEB2 are actively serving content. This is mainly for reliability - if WEB1 goes down, we don't want to have to manually intervene to fail things over. Spreading the load is also nice, but the load is not high enough right now for us to need this. We're planning to configure our firewall to balance traffic across the two servers. It will detect when a server goes down and will send all the traffic to the remaining live server. We're planning to use sticky sessions for now... eventually we may move to SQL Server session state and stateless load balancing. But we need a way for the servers to share content. We were originally planning to move all the content to a UNC share. Our storage provider says they can set up a highly available SMB share for us. So if we go the UNC route, the storage shouldn't be a single point of failure. But we're wondering about the downsides to this approach: We'll need to change the physical paths for each site and virtual directory. There are also some projects that have absolute paths in their web.config files - we'll have to update those as well. We'll need to create a domain user for the web servers to access the share, and grant that user appropriate permissions. I haven't looked into this yet - I'm not sure if the application pool identity needs to be changed to this user, or if there's another way to tell IIS to use this account when connecting to the share. Sites will no longer be able to access their content if there's ever an Active Directory problem. In general, it just seems a lot more complicated, with more moving parts that could break. Our storage provider would create a volume for us on their redundant SAN. If I understand correctly, this SAN volume would be mounted on a VM running in their redundant VMWare environment; this VM would then expose the SMB share to our web servers. On the other hand, a benefit of the shared content approach is that we'd only need to deploy code to one place, and there would never be a temporary inconsistency between multiple copies of the content. This thread is pretty interesting, though some of these people are working at a much larger scale. I've just been discussing content so far, but we also need to think about configuration. I don't know if we can just use DFS replication for the applicationHost.config and other files, or if it's best to use the shared configuration feature with the config on a UNC share. What do you think? Thanks for your help, Richard

    Read the article

  • Use Ubuntu’s Public Folder to Easily Share Files Between Computers

    - by Chris Hoffman
    You’ve probably noticed that Ubuntu comes with a Public folder in your home directory. This folder isn’t shared by default, but you can easily set up several different types of file-sharing to easily share files on your local network. This folder was originally meant for the Personal File Sharing tool, which is no longer included with Ubuntu by default. You can install the Personal File Sharing tool or use Ubuntu’s built-in file-sharing feature to share files.
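
    If you want the Personal File Sharing tool mentioned above, it can still be installed from the Ubuntu archive; a sketch, assuming the package still carries its historical name:

      # Personal File Sharing (shares ~/Public on the local network)
      sudo apt-get install gnome-user-share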

    Read the article

  • How to Share Files Between User Accounts on Windows, Linux, or OS X

    - by Chris Hoffman
    Your operating system provides each user account with its own folders when you set up several different user accounts on the same computer. Shared folders allow you to share files between user accounts. This process works similarly on Windows, Linux, and Mac OS X. These are all powerful multi-user operating systems with similar folder and file permission systems. Windows On Windows, the “Public” user’s folders are accessible to all users. You’ll find this folder under C:\Users\Public by default. Files you place in any of these folders will be accessible to other users, so it’s a good way to share music, videos, and other types of files between users on the same computer. Windows even adds these folders to each user’s libraries by default. For example, a user’s Music library contains the user’s music folder under C:\Users\NAME\as well as the public music folder under C:\Users\Public\. This makes it easy for each user to find the shared, public files. It also makes it easy to make a file public — just drag and drop a file from the user-specific folder to the public folder in the library. Libraries are hidden by default on Windows 8.1, so you’ll have to unhide them to do this. These Public folders can also be used to share folders publically on the local network. You’ll find the Public folder sharing option under Advanced sharing settings in the Network and Sharing Control Panel. You could also choose to make any folder shared between users, but this will require messing with folder permissions in Windows. To do this, right-click a folder anywhere in the file system and select Properties. Use the options on the Security tab to change the folder’s permissions and make it accessible to different user accounts. You’ll need administrator access to do this. Linux This is a bit more complicated on Linux, as typical Linux distributions don’t come with a special user folder all users have read-write access to. The Public folder on Ubuntu is for sharing files between computers on a network. You can use Linux’s permissions system to give other user accounts read or read-write access to specific folders. The process below is for Ubuntu 14.04, but it should be identical on any other Linux distribution using GNOME with the Nautilus file manager. It should be similar for other desktop environments, too. Locate the folder you want to make accessible to other users, right-click it, and select Properties. On the Permissions tab, give “Others” the “Create and delete files” permission. Click the Change Permissions for Enclosed Files button and give “Others” the “Read and write” and “Create and Delete Files” permissions. Other users on the same computer will then have read and write access to your folder. They’ll find it under /home/YOURNAME/folder under Computer. To speed things up, they can create a link or bookmark to the folder so they always have easy access to it. Mac OS X Mac OS X creates a special Shared folder that all user accounts have access to. This folder is intended for sharing files between different user accounts. It’s located at /Users/Shared. To access it, open the Finder and click Go > Computer. Navigate to Macintosh HD > Users > Shared. Files you place in this folder can be accessed by any user account on your Mac. These tricks are useful if you’re sharing a computer with other people and you all have your own user accounts — maybe your kids have their own limited accounts. 
You can share a music library, downloads folder, picture archive, videos, documents, or anything else you like without keeping duplicate copies.
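
    The Nautilus permission changes described in the Linux section boil down to a single chmod; a sketch, with the folder path as a placeholder:

      # give other accounts read/write access to the folder and everything inside it;
      # a capital X adds execute (enter) permission only where it makes sense, i.e. directories
      chmod -R o+rwX /home/YOURNAME/folder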

    Read the article

  • Can I share a TV card?

    - by Boris
    My environment: two PCs, a desktop and a laptop, both on Oneiric; they are connected by an Ethernet cable; nfs-common is installed and configured, with the desktop as the server; a TV tuner card is installed in the desktop, and I can watch TV with the software Me-TV. It all works fine: TV on the desktop, and my network too (I share folders over NFS). But I would like more: how can I share my TV tuner card from the desktop and be able to watch TV on the laptop as well? If possible I would like a solution that lets me keep using Me-TV on both PCs. I suspect there is a way to create a fake TV card on the second PC with xNBD. I'm trying xnbd-server --target /dev/dvb/adapter0/demux0 but I can't make it work. From the examples of xNBD command lines I've found, it seems to be meant only for sharing disk devices. If someone has ever used xNBD, input is welcome.

    Read the article

  • Temporarily share/deploy a python (flask) application

    - by Jeff
    Goal Temporarily (1 month?) deploy/share a python (flask) web app without expensive/complex hosting. More info I've developed a basic mobile web app for the non-profit I work for. It's written in python and uses flask as its framework. I'd like to share this with other employees and beta testers (<25 people). Ideally, I could get some sort of simple hosting space/service and push regular updates to it while we test and iterate on this app. Think something along the lines of dropbox, which of course would not work for this purpose. We do have a website, and hosting services for it, but I'm concerned about using this resource as our website is mission critical and this app is very much pre-alpha at this point. Options I've researched / considered Self host from local machine/network (slow, unreliable) Purchase hosting space (with limited non-profit resources, I'm concerned this is overkill) Using our current web server / hosting (not appropriate for testing) Thanks very much for your time!
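
    For the "self host" option, the zero-cost stopgap for in-office testers is simply binding Flask's development server to the LAN; a sketch assuming a recent Flask and an app object defined in app.py:

      # reachable by anyone on the office network at http://<your-ip>:5000/
      # (equivalently, call app.run(host="0.0.0.0") in app.py and run: python app.py)
      FLASK_APP=app.py flask run --host=0.0.0.0 --port=5000

    This only covers testers inside the office network; anything reachable from outside still needs real hosting.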

    Read the article

  • Share files - Ubuntu 12.4 and Windows 7 - one network - password not accepted

    - by gotqn
    I asked this question on Super User but no one helped me; I hope to get more attention here. I have three computers connected to one network through a modem, and I want to share files on this network in the easiest way possible (I have read about solutions using Samba). The three machines are: one with Windows 7, one with Windows XP, and one with Ubuntu 12.04, and I have the following situation: the Windows PCs can share files between each other; the Windows PCs can see that the Ubuntu machine is on the network; the Ubuntu PC can see only the Windows 7 PC, but when I click on a folder it asks for the network password and does not accept it (I am 100% sure it's the correct one). Is there a way to fix this, at least to enable file sharing between the Ubuntu and Windows 7 PCs, or should I choose a different approach (please advise)?
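
    A quick way to tell whether this is an authentication problem or a browsing problem is to query the Windows 7 machine directly from an Ubuntu terminal; a sketch with placeholder host, share, user and workgroup names:

      # list the shares Windows 7 offers, prompting for the password
      smbclient -L //WINDOWS7-PC -U username
      # if the password is rejected, try passing the workgroup/domain explicitly
      smbclient -L //WINDOWS7-PC -U username -W WORKGROUP
      # then try opening one share interactively
      smbclient //WINDOWS7-PC/SharedDocs -U username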

    Read the article

  • How to share an internet connection and make the client accessible over the LAN

    - by Dario Silva Moran
    I have a PC with Ubuntu 14.04 connected to a Linksys router through wlan0, and I'd like to share the internet connection with an AVR that has an Ethernet port. This is pretty simple if the AVR only needs internet access: creating an Ethernet connection as "Shared with other computers" and setting the AVR's IP configuration to use DHCP works just fine, but that creates a private class A LAN between the two, so of course the IP addresses are not in the range of the LAN the router manages. So I tried static IPs on both sides (Ubuntu's eth0 and the AVR). I tried many combinations, but none of them provide internet access to the AVR and at the same time make the AVR reachable over the network through its static IP address (say, 192.168.0.110). Any tips to share?
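
    One hedged workaround is to keep the "Shared with other computers" setup (which already gives the AVR internet access) and forward a port from the PC's wlan0 address to the AVR, so other LAN machines reach it through the PC; the AVR address and ports below are assumptions:

      # assume the shared connection handed the AVR 10.42.0.20 and its control UI listens on port 80
      sudo iptables -t nat -A PREROUTING -i wlan0 -p tcp --dport 8080 -j DNAT --to-destination 10.42.0.20:80
      sudo iptables -A FORWARD -p tcp -d 10.42.0.20 --dport 80 -j ACCEPT

    Machines on the router's LAN then talk to http://<ubuntu-wlan0-ip>:8080/. Putting the AVR directly onto the router's subnet would instead require bridging eth0 and wlan0, which is generally not possible over an ordinary wireless client interface.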

    Read the article

  • Share on Facebook does not show thumbnail images

    - by matt_tm
    I have a PHP application with a "Share on Facebook" button that, on the development server, shows the thumbnail images correctly and allows the user to select between them, but on the live server does NOT show the thumbnail images at all. The relevant portion of the .htaccess file is: # Set up caching on media files for 2 days <FilesMatch "\.(gif|jpg|jpeg|png|flv)$"> ExpiresDefault A172800 Header append Cache-Control "public" </FilesMatch> I'm using the exact same set of PHP files and .htaccess, but the server configuration is different. What could be causing this? Note that the text appears fine. Edit 1: We are also doing some URL rewriting related to images in the .htaccess (on both servers): ... RewriteRule ^.*/content/image/(.*)$ content/image/$1 [L] ... RewriteRule ^.*/images/(.*)$ images/$1 [L] ... Would that somehow be making a difference? Images appear fine all throughout the site. (I posted this question earlier as http://stackoverflow.com/questions/4142597/share-on-facebook-does-not-show-thumbnail-images)
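
    Since the same files behave differently on the two servers, it can help to compare what Facebook's scraper actually gets back from each; a sketch with placeholder URLs (the crawler identifies itself as facebookexternalhit):

      # do the rewritten image URLs come back with an image content type on the live server?
      curl -I http://www.example.com/images/thumb.jpg
      # fetch the shared page the way the crawler does and inspect the headers and markup
      curl -A "facebookexternalhit/1.1" -s -D - http://www.example.com/some-page | head -40

    Facebook's URL debugger tool shows similar information and also flushes its cache for a given page.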

    Read the article

  • Proper fstab entry to mount a samba share in 12.04

    - by JPbuntu
    I am a little confused on the proper fstab entry for a samba share in Ubuntu 12.04 I can get the drive to mount manually by using: sudo mount -t cifs //192.168.2.2/raid_drive /mnt/homeserver -o username=jon,password=password So I tried putting this in fstab: //192.168.2.2/raid_drive /mnt/homeserver cifs username=jon,password=password,iocharset=utf8,mode=0777,dir_mode=07??77 0 0 Which gives me this error in syslog: kernel: [ 2217.925354] CIFS: Unknown mount option mode kernel: [ 2217.936345] CIFS VFS: default security mechanism requested. The default security mechanism will be upgraded from ntlm to ntlmv2 in kernel release 3.3 This guide says to use smbfs although I believe smbfs is deprecated? What is a common fstab configuration for a samba share in Ubuntu 12.04? EDIT: Using the accepted answer below I was initially getting this error message (from dmesg): [ 45.520883] CIFS VFS: Error connecting to socket. Aborting operation [ 45.520990] CIFS VFS: cifs_mount failed w/return code = -115 although it turns out this was due to network connectivity issues, and not related to improper fstab entry.
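
    For reference, a sketch of a 12.04-style entry: the invalid mode option becomes file_mode/dir_mode, and the password moves out of /etc/fstab into a credentials file (paths and user taken from the question):

      # /etc/fstab
      //192.168.2.2/raid_drive  /mnt/homeserver  cifs  credentials=/home/jon/.smbcredentials,iocharset=utf8,uid=jon,file_mode=0777,dir_mode=0777  0  0

      # /home/jon/.smbcredentials  (chmod 600 it)
      username=jon
      password=password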

    Read the article

  • Setting primary internet connection and network on notebook

    - by Francois
    I have installed Ubuntu on a notebook that I have configured to connect to the internet using an Iburst USB modem. This works 100% after a bit of configuring. I now have a desktop pc that I have installed ubuntu on, and would like to connect the two with a router. I bought a router with wifi, and would like to connect my notebook to the other computer using wifi, while still keeping the internet working with the usb modem. The problem is that as soon as the wifi connect, the internet connection dies. Is there a way to force ubuntu to get internet access through the usb modem, but use wifi to connect to the network? I am pretty new to ubuntu so any help would be appreciated. I also have a samsung galaxy tab that I would like to connect to the internet through usb modem via the wifi, so is there also a way to share that internet connection with the other computers on the network? Thanks in advance.....
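
    If the Wi-Fi connection is taking over the default route when it comes up, one stopgap is to push the default route back onto the modem by hand; a sketch, with assumed interface names (ppp0 for the iBurst modem, wlan0 for the Wi-Fi):

      # see which interface currently owns the default route
      ip route show
      # drop the default route that arrived with the Wi-Fi connection...
      sudo ip route del default dev wlan0
      # ...and make sure the modem provides it instead
      sudo ip route add default dev ppp0

    NetworkManager's IPv4 routes dialog also has a "Use this connection only for resources on its network" option for the Wi-Fi connection, which achieves the same thing persistently.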

    Read the article

  • How do you pass or share variables between django views?

    - by Hugh
    Hi, I'm kind of lost as to how to do this: I have some chained select boxes, with one select box per view. Each choice should be saved so that a query is built up; at the end, the query should be run. But how do you share state in Django? I can pass data from a view to a template, but not from a template to a view, and not from view to view. Or maybe I'm just not sure how to approach this. Please help!

    Read the article

  • Now Instagram lets you record and share 15-second videos with awesome filters

    - by Gopinath
    Instagram is one of the most popular photo-sharing applications, known for amazing filters that turn an ordinary photo into an incredible one. Today Instagram extended those filters and sharing options to videos. With the latest version of the Instagram application for iOS/Android you can record videos, apply filters and share them. Recording and sharing videos on Instagram works much like it does for photos: you can capture videos of up to 15 seconds, and there are 13 filters to choose from for processing. Wondering why the limit is 15 seconds? Pundits say that TV ads generally run 15 seconds and that Instagram is preparing for video ads in the near future. Anyway, within hours of the video feature launching, Instagram was flooded with short videos, and the Explore section has very interesting ones to browse through. Just like photos, you can share the captured videos to Twitter, Facebook and your other social streams from Instagram.

    Read the article

  • Can an app in opt/extras install themeable icons in /usr/share

    - by Peter Levi
    My application Variety Wallpaper Changer runs from /opt/extras and uses an indicator icon. I would like to make this indicator icon theme-specific. As far as I understand the standard way is to install named icons into /usr/share/icons with xdg-icon-resource at installation time (Am I right about this?). I have two questions regarding this: Variety installs and runs from /opt/extras.ubuntu.com. Is it acceptable for it to install icons in /usr/share using xdg-icon-resource or is there something else I can do to have theme-specific icons without special-casing themes and dynamically selecting the icon in the code? Variety is packaged using Quickly (and I'm myself a newbie at packaging) - how can I configure it to install theme-specific named icons at installation time?
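
    A sketch of the first part (installing a named, theme-resolvable icon at package-install time with xdg-icon-resource; the icon file name is hypothetical):

      # installs into the hicolor theme (system-wide when run as root), which icon
      # themes fall back on; the app then refers to the icon by name only
      xdg-icon-resource install --novendor --context apps --size 22 variety-indicator.png variety-indicator

    An icon theme that wants a different look can ship its own icon under the same name and it will take precedence over the hicolor fallback.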

    Read the article
