Search Results

Search found 1560 results on 63 pages for 'bob sully'.

Page 45/63 | < Previous Page | 41 42 43 44 45 46 47 48 49 50 51 52  | Next Page >

  • Google I/O 2010 - WebM Open Video Playback in HTML5

    Google I/O 2010 - WebM Open Video Playback in HTML5 | Chrome 101 | Kevin Carle, Jim Bankoski, David Mendels (Brightcove), Bob Mason (Brightcove). The new open VP8 codec and WebM file format present exciting opportunities for innovation in HTML5 video. In this session, you'll see WebM playback in action while YouTube and Brightcove engineers show you how to support the format in your own HTML5 site. For all I/O 2010 sessions, please go to code.google.com. From: GoogleDevelopers | Views: 4 | 0 ratings | Time: 40:02 | More in Science & Technology

    Read the article

  • Your interesting code tricks/ conventions? [closed]

    - by Paul
    What interesting conventions, rules, tricks do you use in your code? Preferably some that are not so popular, so that the rest of us would find them novelties. :) Here are some of mine... Input and output parameters This applies to C++ and other languages that have both references and pointers. This is the convention: input parameters are always passed by value or const reference; output parameters are always passed by pointer. This way I'm able to see at a glance, directly from the function call, what parameters might get modified by the function: Inspiration: Old C code int a = 6, b = 7, sum = 0; calculateSum(a, b, &sum); Ordering of headers My typical source file begins like this (see code below). The reason I put the matching header first is that, in case that header is not self-sufficient (I forgot to include some necessary library, or forgot to forward declare some type or function), a compiler error will occur. // Matching header #include "example.h" // Standard libraries #include <string> ... Setter functions Sometimes I find that I need to set multiple properties of an object all at once (like when I have just constructed it and need to initialize it). To reduce the amount of typing and, in some cases, improve readability, I decided to make my setters chainable: Inspiration: Builder pattern class Employee { public: Employee& name(const std::string& name); Employee& salary(double salary); private: std::string name_; double salary_; }; Employee bob; bob.name("William Smith").salary(500.00); Maybe in this particular case it could just as well have been done in the constructor. But for Real World™ applications, classes would have lots more fields that should be set to appropriate values, and it becomes unmaintainable to do it in the constructor. So what about you? What personal tips and tricks would you like to share?
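    A minimal, compilable sketch of the two conventions described above. Only the Employee declaration and the calculateSum call appear in the original post; the setter bodies and the small main() are assumed completions added for illustration, not the poster's code.

        // Output parameters by pointer: the &sum at the call site flags what may change.
        #include <iostream>
        #include <string>

        void calculateSum(int a, int b, int* sum) { *sum = a + b; }

        // Chainable ("fluent") setters: each setter returns *this by reference,
        // so several properties can be set in one expression after construction.
        class Employee {
        public:
            Employee& name(const std::string& name) { name_ = name; return *this; }
            Employee& salary(double salary) { salary_ = salary; return *this; }

            const std::string& name() const { return name_; }
            double salary() const { return salary_; }

        private:
            std::string name_;
            double salary_ = 0.0;
        };

        int main() {
            int a = 6, b = 7, sum = 0;
            calculateSum(a, b, &sum);                  // sum is clearly an output

            Employee bob;
            bob.name("William Smith").salary(500.00);  // reads like the builder pattern

            std::cout << sum << ' ' << bob.name() << ' ' << bob.salary() << '\n';
            return 0;
        }

    The call site alone now shows which arguments can be modified, which is the point of the pointer-for-output rule.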

    Read the article

  • SharePoint 2010, Cloud, and the Constitution

    - by Michael Van Cleave
    The other evening an article on the Red Tape Chronicles caught my eye. The article, written by Bob Sullivan and titled "The Constitutional Issues of Cloud Computing", was very interesting with regard to the direction most of the technical world is going. We have all been inundated with reasons to utilize cloud computing, whether price, availability, or scalability; but what Bob brings up is a whole separate view of why a business might not want to move toward the cloud for services or applications. The overall point of the article was pretty simple: items hosted in the cloud (email, documents, etc.) are interpreted differently under the law regarding constitutional search and seizure than, say, a document or item kept in physical form at a business or home. If you physically store something, someone has to get a warrant to search for it or seize it; but if it is stored off in the cloud and the ISV or provider is subpoenaed for the item, they will usually give access to the information. Obviously this is a big difference in how the law and the constitution are interpreted because of technology. So you might ask, "Where does this fit in with SharePoint?" Well, the overall push for this next version of SharePoint is to give a business ultimate flexibility to utilize the cloud. For example, the upcoming version gracefully lends itself to multi-tenancy, so that online or "cloud" hosting would be possible by service providers. Another aspect of the upcoming version is that it has updated its ability to store content outside of the database, in cheaper, commoditized storage. This is called Remote Blob Storage (or RBS), the next evolution of External Blob Storage (or EBS). With this new functionality that businesses might look forward to, it is extremely important for them to understand that they might be opening themselves up to searches of their cloud-stored information that do not require a warrant. It will be interesting to see how this all plays out in the next few months. Laws usually change slowly in comparison to technology, so it might be a while until we see whether it is actually constitutional to treat someone's content in the cloud differently from content in their possession; until some type of parity happens, or more concrete laws address the differences, be very careful about what you put in the cloud. Michael

    Read the article

  • How to rotate FBX files by 90 degrees while running on a path with iTween in Unity 3D

    - by Jack Dsilva
    I am making a racing game in which I use the iTween path system to smooth the camera's turning through corners, and the iTween path system works fine (special thanks to Bob Berkebile). At first I used a cube to follow the path, and it turned fine. My problem is that when I use an FBX character instead of the cube to follow the path, the character will not move when a turn comes. This is my problem (first image), and the second image shows what I want. How do I solve this problem?

    Read the article

  • NHibernate Tutorial #5 - Working with Many to Many relationships

    - by BobPalmer
    After a short break last week, I wanted to make sure I made time to publish the next in my series of tutorials on NHibernate. This week I'll be covering many-to-many relationships, the hilo algorithm, and the IdBag element, and I'll touch on lazy loading. You can view the entire article at this link: http://docs.google.com/Doc?docid=0AUP-rKyyUMKhZGczejdxeHZfMjZkdjd3cjJnMg&hl=en As always, feedback and any technical bits I may have missed are always appreciated! -Bob Palmer
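    For readers who have not met it before, a rough sketch of the hi/lo idea mentioned above: the database hands out a rarely incremented "hi" value, and identifiers within the corresponding block are generated locally, so most inserts need no extra round trip for an id. This is a language-neutral illustration only; NHibernate's actual generator (and its exact id formula) is configured in the mapping rather than written like this, and the class and names below are invented for the example.

        // Hi/lo id generation, sketched: ids in the block
        // [hi * blockSize, hi * blockSize + blockSize - 1] are produced in memory;
        // only exhausting a block requires another trip to the database.
        #include <cstdint>
        #include <iostream>

        class HiLoGenerator {
        public:
            explicit HiLoGenerator(std::int64_t blockSize)
                : blockSize_(blockSize), lo_(blockSize) {}

            std::int64_t nextId() {
                if (lo_ >= blockSize_) {              // current block exhausted
                    hi_ = fetchNextHiFromDatabase();  // NHibernate reads/updates a "hi" table here
                    lo_ = 0;
                }
                return hi_ * blockSize_ + lo_++;
            }

        private:
            // Stand-in for the database round trip that increments the stored "hi".
            std::int64_t fetchNextHiFromDatabase() { return ++storedHi_; }

            std::int64_t blockSize_;
            std::int64_t lo_;
            std::int64_t hi_ = 0;
            std::int64_t storedHi_ = 0;
        };

        int main() {
            HiLoGenerator gen(100);                   // 100 ids per database round trip
            for (int i = 0; i < 3; ++i)
                std::cout << gen.nextId() << '\n';    // 100, 101, 102 with this sketch
            return 0;
        }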

    Read the article

  • Free HTML5 & CSS3 Fundamentals course

    - by TATWORTH
    Originally posted on: http://geekswithblogs.net/TATWORTH/archive/2013/10/13/free-html5--css3-fundamentals-course.aspx At http://www.microsoftvirtualacademy.com/training-courses/html5-css3-fundamentals-development-for-absolute-beginners there is a free course on HTML5 & CSS3 Fundamentals. This is not a course for pretty web design but for writing good, standards-compliant HTML. Please note that to get the work files for the course you need to go to http://channel9.msdn.com/Series/HTML5-CSS3-Fundamentals-Development-for-Absolute-Beginners/Series-Introduction-01 as the Microsoft Academy downloads do not seem to work! The course is done by Bob Tabor, who runs http://www.learnvisualstudio.net

    Read the article

  • NHibernate tutorial #6 - Parent-Child Relationships

    - by BobPalmer
    I've finally had a chance to continue my NHibernate tutorial series after a series of vacations and events.  In this tutorial, I cover one of the most common relationships, that of the parent-child, in NHibernate.  I also go through some optimization refactoring along the way. You can view the entire Google Docs article here: http://docs.google.com/Doc?docid=0AUP-rKyyUMKhZGczejdxeHZfMzBmdjdzZDlkaA&hl=en   As always, feedback is appreciated! -Bob

    Read the article

  • My Doors - Why Standards Matter to Business

    - by [email protected]
    By Brian Dayton on April 8, 2010 9:27 PM "Standards save money." "Standards accelerate projects." "Standards make better solutions." What do these statements mean to you? You buy technology solutions like Oracle Applications but you're a business person--trying to close the quarter, get performance reviews processed, negotiate a new sourcing contract, etc. When "standards" come up in presentations and discussions do you: - Nod your head politely - Tune out and check your smart phone - Turn to your IT counterpart and say "Bob's all over this standards thing, right Bob?" Here's why standards matter. My wife wants new external doors downstairs, ones that would get more light into the rooms. Am I OK with that? "Uhh, sure...it's a little dark in the kitchen." - 24 hours ago - wife calls to tell me that she's going to the hardware store and may look at doors - 20 hours ago - wife pulls into driveway, informs me that two doors are in the back of her station wagon, ready for me to carry - 19 hours ago - I re-discovered the fact that it's not fun to carry a solid wood door by myself - 5 hours ago - Local handyman, who was at our house anyway, tells me that the doors we bought will likely cost 2-3x the material cost in installation time and labor...the doors are standard but our doorways aren't We could have done more research. I could be more handy. Sure. But the fact is, my 1951 house wasn't built with me in mind. They built what worked and called it a day. The same holds true with a lot of business applications. They were designed and architected for one-time use with one use-case in mind. Today's business climate is different. If you're going to use your processes and technology to differentiate your business you should have at least a working knowledge of: - How standards can benefit your business - Your IT organization's philosophy around standards - Your vendor's track-record around standards...and watch for those who pay lip-service to standards but don't follow through The rallying cry in most IT organizations today is "learn more about the business, drop the acronyms." I'm not advocating that you go out and learn how to code in Java. But I do believe it will help your business and your decision-making process if you meet IT ½...even ¼ of the way there. Epilogue: The door project has been put on hold and yours truly has to return the doors to the hardware store tomorrow.

    Read the article

  • Interested in going to the cloud? Then this might be useful

    - by simonsabin
    Bob Duffy is doing an afternoon seminar on Azure. It will provide an introduction to the Azure platform, and in particular SQL Azure; show tools and methodologies for migrating on-premise databases into the cloud, using a sample application and database; and finally detail some of the Azure-specific features, such as federations, that enable massive-scale OLTP solutions. http://www.prodata.ie/Events/2012/SQL-Azure_and_the_Cloud.aspx...(read more)

    Read the article

  • T-SQL: Why “It Depends”

    Why does everyone use "it depends" as an answer to many T-SQL questions? Bob Hovious brings us a short example of how performance can change based on data loads for the same code.

    Read the article

  • Implementing a Repository with NHibernate - Quickstart with NHibernate (part 2)

    - by BobPalmer
    This is the second in a series of tutorials I am working on to help developers quickly get up to speed with NHibernate.  In this tutorial, I'll be focusing on an implementation of a repository pattern. As always, comments, suggestions, and any technical bits I may have missed are always appreciated! You can view the entire article via this Google Docs link: http://docs.google.com/Doc?docid=0AUP-rKyyUMKhZGczejdxeHZfMTVjMnBqYjVnNw&hl=en Enjoy! -Bob
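    As a quick orientation for anyone new to the pattern the tutorial covers: a repository puts a small, collection-like interface in front of the persistence mechanism so the rest of the code never touches the ORM directly. The sketch below is a language-neutral illustration of that idea, not code from the article (which uses C#/NHibernate); the Employee type and the in-memory store are invented for the example, and an NHibernate-backed implementation would delegate to a session behind the same interface.

        // Repository pattern, minimally: callers depend on the interface,
        // while the storage mechanism behind it can be swapped out.
        #include <iostream>
        #include <map>
        #include <optional>
        #include <string>

        struct Employee {
            int id = 0;
            std::string name;
        };

        class EmployeeRepository {
        public:
            virtual ~EmployeeRepository() = default;
            virtual void save(const Employee& e) = 0;
            virtual std::optional<Employee> findById(int id) = 0;
        };

        // In-memory implementation; a persistent one would talk to the database
        // (via NHibernate, in the tutorial's case) behind the same interface.
        class InMemoryEmployeeRepository : public EmployeeRepository {
        public:
            void save(const Employee& e) override { store_[e.id] = e; }
            std::optional<Employee> findById(int id) override {
                auto it = store_.find(id);
                if (it == store_.end()) return std::nullopt;
                return it->second;
            }
        private:
            std::map<int, Employee> store_;
        };

        int main() {
            InMemoryEmployeeRepository repo;
            repo.save({1, "William Smith"});
            if (auto e = repo.findById(1))
                std::cout << e->name << '\n';   // prints the saved employee
            return 0;
        }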

    Read the article

  • The open-source entrepreneur

    BBC News: "Bob Young is a self-confessed contrarian with a strong desire to change the world by allowing people to share and collaborate. The approach has served him well and has helped turn the Canadian into a multi-millionaire."

    Read the article

  • Oracle Solaris 11 pkg fix

    - by Larry Wake
    Bob Netherton explains why Solaris 11 pkg fix is his new friend. "So far so good. Then comes an oops... This is where you generally say a few things to yourself, and then promise to quit deleting configuration files and directories when you don't know what you are doing. Then you recall that the new Solaris 11 packaging system has some ability to correct common mistakes (like the one I just made)." [Read More]

    Read the article

  • Can a wifi AP act as a client and a server at the same time?

    - by nbolton
    I feel this is SF worthy (as opposed to SU) as I go into a bit of detail on gateways/routing. Here's my ideal setup (if possible) -- there is a wifi network (let's call it bob's) which I want access to, but I have a few other computers on my network which I want to keep behind a firewall. So I was thinking of buying a wireless access point so that I could set it up to connect to bob's network from the AP, and then from my server, connect to the AP via ethernet. So that's the first bit. The second part is that I want to have my own private wifi network off the back of this; can I then tell the AP to serve a new network called foobar? When I say private network, I mean that my server is actually a Debian linux install with routing configured (and I also do some QoS stuff on it, etc.). So ideally, I'd like all the clients on the private network to be behind the server in terms of routing. However, if the private clients connect to the server via wifi, then aren't they exposed to the "public" network? That is, if someone is savvy enough to scan for my IP range. Also, to do routing I'd need to connect two ethernet cables between the server and the AP (because you can't do routing/QoS on virtual devices) -- which isn't a problem really; but I'm not sure whether the AP will allow me to separate the public and private LANs. Or, as well as the AP, am I better off getting a wifi-to-ethernet adapter for the server? I could use a wifi usb, but this can be tricky to set up on headless linux; plus the signal strength is a bit lousy. If this question is a bit vague/spurious in places, please comment and I will explain in more detail.

    Read the article

  • MOSS2007 tries to use ActiveDirectory when I have configured an alternative membership provider

    - by glenatron
    I've got a MOSS site that I am trying to configure using Forms authentication and absolutely any kind of membership provider whatsoever. Thus far ActiveDirectory has proved obstructively difficult so I've just whipped up a simple stub membership provider and put it in the GAC. It's a very basic and simple provider but it works fine with an ASP.Net site, I just can't make it work with Sharepoint. On Sharepoint I get the following error when I look for StubProvider:Bob ( or anything else for that matter) from the "Policy For Web Application" people picker: Error in searching user 'StubProvider:bob' : System.ComponentModel.Win32Exception: Unable to contact the global catalog server at Microsoft.SharePoint.Utilities.SPActiveDirectoryDomain.GetDirectorySearcher() at Microsoft.SharePoint.WebControls.PeopleEditor.SearchFromGC(SPActiveDirectoryDomain domain, String strFilter, String[] rgstrProp, Int32 nTimeout, Int32 nSizeLimit, SPUserCollection spUsers, ArrayList& rgResults) at Microsoft.SharePoint.Utilities.SPUserUtility.SearchAgainstAD(String input, SPActiveDirectoryDomain domainController, SPPrincipalType scopes, SPUserCollection usersContainer, Int32 maxCount, String customQuery, TimeSpan searchTimeout, Boolean& reachMaxCount) at Microsoft.SharePoint.Utilities.SPActiveDirectoryPrincipalResolver.SearchPrincipals(String input, SPPrincipalType scopes, SPPrincipalSource sources, SPUserCollection usersContainer, Int32 maxCount, Boolean& reachMaxCount) at Microsoft.SharePoint.Utilities.SPUtility.SearchPrincipalFromResolvers(List`1 resolvers, String input, SPPrincipalType scopes, SPPrincipalSource sources, SPUserCollection usersContainer, Int32 maxCount, Boolean& reachMaxCount, Dictionary`2 usersDict). The Provider is named as Authentication Provider for the Site Collection in question. As far as I can tell this is because Sharepoint is still trying to access ActiveDirectory rather than talking to the provider I'm asking it to use. My Sharepoint Central Administration section includes this: <membership> <providers> <add name="StubProvider" type="StubMembershipProvider.Provider, StubMembershipProvider, Version=1.0.0.0, Culture=neutral, PublicKeyToken=5bd7e2498c3e1a03" /> </providers> </membership> And also: <PeoplePickerWildcards> <clear /> <add key="StubProvider" value="%" /> </PeoplePickerWildcards> Is there a clear reason why this would not be accessible from the PeoplePicker or why it is still trying to use ActiveDirectory? I've made sure I reset IIS and even restarted the server to see if either of those helped but they made no difference.

    Read the article

  • SFTP, Chroot problems on Redhat

    - by Curtis_w
    I'm having problems setting up sftp with a ChrootDirectory. I've done an equivalent setup on other distros, but for some reason I cannot get it to work on a Redhat AMI. The changes to my sshd_config file are: Subsystem sftp internal-sftp Match Group ftponly PasswordAuthentication yes X11Forwarding no ChrootDirectory %h ForceCommand internal-sftp AllowTcpForwarding no I have the users' homes in question at /home/user, owned by root. After connecting with a user in the ftponly group, I'm dropped into / without permissions for anything, and am unable to do anything. sftp bob@localhost Connecting to localhost... bob@localhost's password: sftp> pwd Remote working directory: / I can connect normally with users not in the ftponly group. openssh version 5.3 I've experimented with different permissions, as well as having users own their own home directory (gives a Write failed: Broken pipe error), and so far, nothing has seemed to work. I'm sure it's a permissions error, or something equally trivial, but at this point my eyes are beginning to glaze over, and any help would be greatly appreciated. EDIT: James and Madhatter, thanks for clarifying. I was confused by chroot dropping me in /... just didn't think through it properly. I've added the appropriate directories and permissions to get read access. One other key part was enabling write access to chrooted homes with: setsebool -P ssh_chroot_rw_homedirs on I think I'm all set now. Thanks for the help.

    Read the article

  • converting to MXF using ffmpeg

    - by Prakash
    I have been trying to use the FFmpeg utility to convert an avi file to MXF format using DNxHD. I am running ffmpeg with the following parameters: ffmpeg -i ccvt_box.avi -vcodec dnxhd -video_size 1920x1080 -r 24 -b:v 115m ex.mxf The error it gives is: ffmpeg version N-43737-g76c3fff Copyright (c) 2000-2012 the FFmpeg developers built on Aug 20 2012 18:50:42 with llvm-gcc 4.2.1 (LLVM build 2336.11.00) configuration: libavutil 51. 70.100 / 51. 70.100 libavcodec 54. 53.100 / 54. 53.100 libavformat 54. 25.104 / 54. 25.104 libavdevice 54. 2.100 / 54. 2.100 libavfilter 3. 11.101 / 3. 11.101 libswscale 2. 1.101 / 2. 1.101 libswresample 0. 15.100 / 0. 15.100 Input #0, avi, from 'ccvt_box.avi': Duration: 00:00:10.00, start: 0.000000, bitrate: 691 kb/s Stream #0:0: Video: indeo5 (IV50 / 0x30355649), yuv410p, 340x344, 10 tbr, 10 tbn, 10 tbc Metadata: title : bob.avi [dnxhd @ 0x7fcd60818e00] video parameters incompatible with DNxHD Output #0, mxf, to 'ex.mxf': Stream #0:0: Video: dnxhd, yuv422p, 340x344, q=2-1024, 90k tbn, 24 tbc Metadata: title : bob.avi Stream mapping: Stream #0:0 -> #0:0 (indeo5 -> dnxhd) Error while opening encoder for output stream #0:0 - maybe incorrect parameters such as bit_rate, rate, width or height

    Read the article

  • windows xp cannot access admin share

    - by barlop
    I have 3 systems, A, B, and Compx, all on XP, but computers A and B have an issue with Compx. Compx has network shares I can access. I can do \\compx and get some. But I cannot access the admin share c$: \\compx\c$ gives a login prompt, and I can't get any user/pass to work. I looked at permissions but don't see an issue. Nevertheless, I will describe what I see in the permissions. In the Security tab of C, I have Administrators, creator owner, everyone, bob, system, users (6 entries there). "creator owner" has nothing ticked, and I can't seem to change that. If I tick it so they all get ticked and click Apply, after about 2.5 minutes it completes its operation and they all untick again. This isn't the root of the problem, though, since I get the same in the share I can access. In Advanced, I see those 6 entries, Administrators, creator owner, everyone, bob, system, users, all with "full control" and all applying to "this folder, subfolders and files", except creator owner, which is "subfolders and files only". Looking at the properties for the share I can see, it looks the same, except that in Security > Advanced, double-clicking any of them shows the boxes all ticked but greyed out. That's not the problem though, since I can access that share. So, I don't know what the problem is.

    Read the article

  • SSH & SFTP: Should I assign one port to each user to facilitate bandwidth monitoring?

    - by BertS
    There is no easy way to track real-time per-user bandwidth usage for SSH and SFTP. I think assigning one port to each user may help. Idea of implementation Use case Bob, with UID 1001, shall connect on port 31001. Alice, with UID 1002, shall connect on port 31002. John, with UID 1003, shall connect on port 31003. (I do not want to launch several sshd instances as proposed in question 247291.) 1. Setup for SFTP: In /etc/ssh/sshd_config: Port 31001 Port 31002 Port 31003 Subsystem sftp /usr/bin/sftp-wrapper.sh The file sftp-wrapper.sh starts the sftp server only if the port is the correct one: #!/bin/sh mandatory_port=3`id -u` current_port=`echo $SSH_CONNECTION | awk '{print $4}'` if [ $mandatory_port -eq $current_port ] then exec /usr/lib/openssh/sftp-server fi 2. Additional setup for SSH: A few lines in /etc/profile prevent the user from connecting on the wrong port: if [ -n "$SSH_CONNECTION" ] then mandatory_port=3`id -u` current_port=`echo $SSH_CONNECTION | awk '{print $4}'` if [ $mandatory_port -ne $current_port ] then echo "Please connect on port $mandatory_port." exit 1 fi fi Benefits Now it should be easy to monitor per-user bandwidth usage. An rrdtool-based application could produce charts like this: I know this won't be a perfect calculation of the bandwidth usage: for example, if somebody launches a brute-force attack on port 31001, there will be a lot of traffic on this port although not from Bob. But this is not a problem to me: I do not need an exact computation of per-user bandwidth usage, but an indicator that is approximately correct in standard situations. Questions Is the idea of assigning one port to each user a good one? Is the proposed setup a reliable one? If I have to open dozens of ports for many users, should I expect a performance drawback? Do you know an rrdtool-based application which could make the chart above?

    Read the article

  • Crossover LAN connection between Ubuntu And Windows 7 is not working

    - by brett
    My question is closely related to: How do I connect Ubuntu 10.04 and Windows 7 with an Ethernet cable? What I am after is to keep both machines connected wirelessly to the Wifi router (Windows 7 ---wireless--- Wifi router ---wireless--- Ubuntu 10.04) and, in addition, run a crossover cable directly between the Windows 7 and Ubuntu 10.04 machines. What I did was: On Windows, edit system32\drivers\etc\hosts and add the following line: 192.168.253.2 my_ubuntu_computer_name_&-wired //? not sure if this is right. On Ubuntu: sudo gedit /etc/hosts and add the following line: 192.168.253.1 my_pc_computer_name&-wired //? not sure if this is right. And then, with Ubuntu 12.04 as the host: right-click on the Network Manager applet, click Edit Connections... In the Wired tab, click Auto eth0, then click Edit... In the IPv4 Settings tab, change Method: to Shared to other computers. Click Apply and enter your password when it asks you. Close everything and reboot. Plug the Ethernet cable into both computers. But: I can connect to my Windows network folders from Ubuntu via wifi; I can't connect to my Ubuntu network folders from Windows via wifi (in fact this bit was working before, so my wifi connection is worse than it was); my Ubuntu Auto Ethernet connection seems to be on. From Ubuntu: eth0 Link encap:Ethernet HWaddr 00:11:2f:f3:43:8d inet addr:10.42.0.1 Bcast:10.42.0.255 Mask:255.255.255.0 inet6 addr: fe80::211:2fff:fef3:438d/64 Scope:Link UP BROADCAST RUNNING MULTICAST MTU:1500 Metric:1 RX packets:0 errors:0 dropped:0 overruns:0 frame:0 TX packets:172 errors:0 dropped:0 overruns:0 carrier:0 collisions:0 txqueuelen:1000 RX bytes:0 (0.0 B) TX bytes:27279 (27.2 KB) Interrupt:19 Base address:0xe400 lo Link encap:Local Loopback inet addr:127.0.0.1 Mask:255.0.0.0 inet6 addr: ::1/128 Scope:Host UP LOOPBACK RUNNING MTU:16436 Metric:1 RX packets:1147 errors:0 dropped:0 overruns:0 frame:0 TX packets:1147 errors:0 dropped:0 overruns:0 carrier:0 collisions:0 txqueuelen:0 RX bytes:94380 (94.3 KB) TX bytes:94380 (94.3 KB) wlan0 Link encap:Ethernet HWaddr 00:03:c9:e9:6f:bf inet addr:10.1.1.7 Bcast:10.1.1.255 Mask:255.255.255.0 inet6 addr: fe80::203:c9ff:fee9:6fbf/64 Scope:Link UP BROADCAST RUNNING MULTICAST MTU:1500 Metric:1 RX packets:13186 errors:0 dropped:0 overruns:0 frame:0 TX packets:12187 errors:0 dropped:0 overruns:0 carrier:0 collisions:0 txqueuelen:1000 RX bytes:1598882 (1.5 MB) TX bytes:1189555 (1.1 MB) From Windows: Windows IP Configuration Ethernet adapter Bluetooth Network Connection: Media State . . . . . . . . . . . : Media disconnected Connection-specific DNS Suffix . : Wireless LAN adapter Wireless Network Connection: Connection-specific DNS Suffix . : BoB Link-local IPv6 Address . . . . . : fe80::ecf7:c445:3725:b9c1%12 IPv4 Address. . . . . . . . . . . : 10.1.1.4 Subnet Mask . . . . . . . . . . . : 255.255.255.0 Default Gateway . . . . . . . . . : 10.1.1.1 Tunnel adapter Local Area Connection* 15: Connection-specific DNS Suffix . : IPv6 Address. . . . . . . . . . . : 2001:0:4137:9e76:1423:3ae3:f5fe:fefb Link-local IPv6 Address . . . . . : fe80::1423:3ae3:f5fe:fefb%23 Default Gateway . . . . . . . . . : :: Tunnel adapter isatap.BoB: Media State . . . . . . . . . . . : Media disconnected Connection-specific DNS Suffix . : BoB Tunnel adapter isatap.{D0C8EBA1-335D-4620-8570-6C36E8786D72}: Media State . . . . . . . . . . . : Media disconnected Connection-specific DNS Suffix . :

    Read the article

  • UDP blocked by Windows XP Firewall when sending to local machine

    - by user36367
    I work for a software development company but the issue doesn't seem to be programming-related. Here is my setup: Windows XP Professional with Service Pack 3, all updated; a program that sends UDP datagrams; a program that receives UDP datagrams; and Windows Firewall set to allow inbound UDP datagrams on a specific port (Scope: Subnet). If I send a UDP datagram on any port to other, similar machines, it goes through. If I send the UDP datagram to the same computer running the program that sends (whether using broadcast, localhost IP or the specific IP of the machine), the receiver program gets nothing. I've traced the problem down to the Windows XP Firewall, as Windows 7 does not have this problem (and I do not wish to sully my hands with Vista). If the exception I create for that UDP port in the WinXP firewall is set for a Scope of Subnet, the datagram is blocked, but if I set it to All Computers or specifically enter my network settings (192.168.2.161 or 192.168.2.0/255.255.255.0) it works fine. Using different UDP ports makes no difference. I've tried different programs to reproduce this problem (ServerTalk to send and either IP Port Spy or PortPeeker to receive) to make sure it's not our code that's the issue, and those programs' datagrams were blocked as well. Also, that computer only has one network interface, so there is no additional network weirdness. I receive my IP from a DHCP server, so this is a straightforward setup. Given that it doesn't happen in Windows 7 I must assume it's a defect in the Windows XP Firewall, but I'd think someone else would have encountered this problem before. Has anyone encountered anything like this? Any ideas?
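    For anyone wanting to reproduce the sending side of such a test, it can be as small as the sketch below. This is an illustrative stand-in, not the poster's program: it uses POSIX sockets rather than the Winsock calls a Windows XP build would need (WSAStartup, closesocket, and so on), and the port number is an arbitrary placeholder for whichever port the firewall exception covers.

        // Minimal UDP sender: one datagram to the loopback address, which is the
        // case reported as blocked by the Windows XP firewall above.
        #include <arpa/inet.h>
        #include <netinet/in.h>
        #include <sys/socket.h>
        #include <unistd.h>
        #include <cstdio>

        int main() {
            int sock = socket(AF_INET, SOCK_DGRAM, 0);
            if (sock < 0) { perror("socket"); return 1; }

            sockaddr_in dest{};
            dest.sin_family = AF_INET;
            dest.sin_port = htons(9999);                       // placeholder port
            inet_pton(AF_INET, "127.0.0.1", &dest.sin_addr);   // same machine

            const char msg[] = "ping";
            if (sendto(sock, msg, sizeof msg, 0,
                       reinterpret_cast<const sockaddr*>(&dest), sizeof dest) < 0)
                perror("sendto");

            close(sock);
            return 0;
        }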

    Read the article
