Search Results

Search found 7490 results on 300 pages for 'algorithm analysis'.


  • Data transformation question

    - by tkm
    I have data composed of a list of employers and a list of workers. Each has a many-to-many relationship with the other (so an employer can have many workers, and a worker can have many employers). The way the data is retrieved (and given to me) is as follows: each employer has an array of workers. In other words: employer n has: worker x, worker y etc. So I have a bunch of employer objects, each containing an array of workers. I need to transform this data (and basically invert the relationship): I need a bunch of worker objects, each containing an array of employers. In other words: worker x has: employer n1, employer n2 etc. The context is hypothetical so please don't comment on why I need this or why I am doing it this way. I would really just like help on the algorithm to perform this transformation (there isn't that much data, so I would prefer readability over complex optimizations). (Oh, and I am using Java, but pseudocode would be fine.) Thanks!
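    A minimal sketch of the inversion described above, written in C# since the asker says pseudocode is fine (the Employer/Worker types and their members are illustrative): walk each employer once and append it to every one of its workers, collecting the workers in a dictionary so each appears only once.

      using System.Collections.Generic;

      class Employer { public string Name; public List<Worker> Workers = new List<Worker>(); }
      class Worker   { public string Name; public List<Employer> Employers = new List<Employer>(); }

      static List<Worker> InvertRelationship(IEnumerable<Employer> employers)
      {
          // One pass over the employers; each worker object accumulates its own employer list.
          var seen = new Dictionary<Worker, Worker>();
          foreach (var employer in employers)
          {
              foreach (var worker in employer.Workers)
              {
                  if (!seen.ContainsKey(worker))
                      seen[worker] = worker;
                  seen[worker].Employers.Add(employer);
              }
          }
          return new List<Worker>(seen.Keys);
      }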

    Read the article

  • Programmatically Change Colors in C#

    - by Jason
    I am currently working on a project where, as different users add text to a document, I would like the color of the text to change. Originally, I was using C#'s predefined color values and just putting the ones I wanted to use into an enum in my application and cycling through the colors as different users added annotations. This works fine, and I am okay with this solution. However, I also could have chosen to change the RGB values and derived colors that way. I'm curious about what type of algorithm would be good for changing those values to get distinct sets of colors. This is more just an exercise in something I had thought about. To clarify a little bit, I don't want to just increment one of the color values (R, G, or B) because that wouldn't give me enough variety in my colors. But I don't think it would work to increment all three by equal amounts either. I also have to watch out for repeating colors (up to a point). The requirements for my project anticipate, at most, 10 different reviewers.
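    One common approach (a hedged sketch, not from the original post; it uses System.Drawing.Color, so adapt as needed): step the hue around the HSV color wheel in equal increments with fixed saturation and value, so ten reviewers get ten visually distinct, non-repeating colors.

      using System.Drawing;

      // Returns the i-th of n distinct colors by rotating hue; saturation/value stay fixed.
      static Color ReviewerColor(int i, int n)
      {
          double h = (360.0 * i / n) % 360.0, s = 0.8, v = 0.9;
          int sector = (int)(h / 60) % 6;
          double f = h / 60 - (int)(h / 60);
          double p = v * (1 - s), q = v * (1 - f * s), t = v * (1 - (1 - f) * s);
          double r, g, b;
          switch (sector)
          {
              case 0: r = v; g = t; b = p; break;
              case 1: r = q; g = v; b = p; break;
              case 2: r = p; g = v; b = t; break;
              case 3: r = p; g = q; b = v; break;
              case 4: r = t; g = p; b = v; break;
              default: r = v; g = p; b = q; break;
          }
          return Color.FromArgb((int)(r * 255), (int)(g * 255), (int)(b * 255));
      }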

    Read the article

  • How to terminate a request in JSP (not the "return;")

    - by Genom
    I am programming a website with JSP. There are pages that the user must be logged in to see; if they are not logged in, they should see a login form instead. I have seen PHP code where a single file checks whether the user is logged in: if not, it shows the login form; if the user is logged in, it does nothing. So I use this structure in my JSPs: the check at the top, then the headers, menus, etc., and the normal stuff (body, footer) that would be shown to a logged-in user. This structure is very easy to apply to all pages, so I don't have to add the checking logic to each one: I can simply add the include (something like <%@ include file="checklogin.jsp" %>) and the page is secure. My problem is that if the user is not logged in, only the login form and the footer should be shown; the code should bypass the body. I therefore structured my checklogin.jsp so that if the user is not logged in, it shows the login form and the footer and terminates the request. The problem is that I don't know how to terminate the request. If I use "return;" then only checklogin.jsp stops, but the server continues to process the parent page, so the page ends up with two footers (one from the parent page and one from checklogin.jsp). How can I avoid this? (PHP has exit(); for this, by the way.) Thanks for any suggestions!

    Read the article

  • How can I provide property-level string formatting?

    - by jhubsharp
    Not sure if the question wording is accurate, but here's what I want to do: I want to databind a class with some strings in it: class MyClass { public string MyProperty { get; set; } public string MyProperty2 { get; set; } } When databinding, everything behaves normally. On the back end, I'm writing to a file and I want MyProperty2 to always be encrypted using some encryption algorithm. I want my back-end code to write each string without needing to know that encryption is required (I want the class to know it should be encrypted, not the consumer). Can I do this with a type converter, or something similar? EDIT: There are other scenarios as well. Some booleans I want to format as "Y" or "N", other booleans I want formatted as "Enabled" / "Disabled", etc. I can write (and have written) helper methods and let the file writer call the helper methods as appropriate, I'm just wondering if there's a way to do this without the file writer needing to know which objects need which kind of formatting and let the objects tell that to the file writer.
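    One direction that fits the "type converter, or something similar" idea above (a hedged sketch; the attribute placement, converter, and encryption helper are all illustrative, and the Base64 call is a placeholder rather than real encryption): decorate the property with a TypeConverter and have the file writer ask each PropertyDescriptor for its converter, so the class carries the formatting knowledge and the writer stays generic.

      using System;
      using System.ComponentModel;

      class EncryptingConverter : StringConverter
      {
          public override object ConvertTo(ITypeDescriptorContext context,
              System.Globalization.CultureInfo culture, object value, Type destinationType)
          {
              if (destinationType == typeof(string) && value is string)
                  return Encrypt((string)value);
              return base.ConvertTo(context, culture, value, destinationType);
          }

          static string Encrypt(string plain)
          {
              // Placeholder only; a real implementation would apply a proper encryption algorithm.
              return Convert.ToBase64String(System.Text.Encoding.UTF8.GetBytes(plain));
          }
      }

      class MyClass
      {
          public string MyProperty { get; set; }

          [TypeConverter(typeof(EncryptingConverter))]
          public string MyProperty2 { get; set; }
      }

      // The writer stays generic: it just asks each property for its converter.
      static void WriteAll(object item, System.IO.TextWriter output)
      {
          foreach (PropertyDescriptor prop in TypeDescriptor.GetProperties(item))
              output.WriteLine("{0}={1}", prop.Name, prop.Converter.ConvertToString(prop.GetValue(item)));
      }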

    Read the article

  • How to duplicate an object in a list and update a property of the duplicated objects?

    - by user359706
    Hello. What would be the best way to duplicate an object placed in a list of items and change a property of the duplicated objects? I thought of proceeding in the following manner: get the object in the list by "ref" + "article", clone the found object as many times as desired (n times), remove the object found, and add the clones to the list. What do you think? A concrete example: private List<Product> listProducts; listProducts = new List<Product>(); Product objProduct_1 = new Product(); objProduct_1.ref = "001"; objProduct_1.article = "G900"; objProduct_1.quantity = 30; listProducts.Add(objProduct_1); Product objProduct_2 = new Product(); objProduct_2.ref = "002"; objProduct_2.article = "G900"; objProduct_2.quantity = 35; listProducts.Add(objProduct_2); Desired method: public void updateProductsList(List<Product> paramListProducts, Product objProductToUpdate, Int32 nbrDuplication, Int32 newQuantity) { ... } Example call: updateProductsList(listProducts, objProduct_1, 2, 15); Expected result: replace the object ref = "001"; article = "G900"; quantity = 30; with ref = "001"; article = "G900"; quantity = 15; and ref = "001"; article = "G900"; quantity = 15. Is the algorithm correct? Would you have an idea how to implement the "updateProductsList" method? Thank you in advance for your help.
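    A hedged sketch of one way to implement the method described above (the Product class here is a stand-in, and the "ref" field is renamed refCode because ref is a reserved word in C#):

      using System.Collections.Generic;

      class Product { public string refCode; public string article; public int quantity; }

      static void updateProductsList(List<Product> products, Product toUpdate,
                                     int nbrDuplication, int newQuantity)
      {
          // Find the original by its identifying fields.
          Product original = products.Find(p => p.refCode == toUpdate.refCode
                                             && p.article == toUpdate.article);
          if (original == null) return;

          products.Remove(original);

          // Insert the requested number of copies carrying the new quantity.
          for (int i = 0; i < nbrDuplication; i++)
          {
              products.Add(new Product
              {
                  refCode = original.refCode,
                  article = original.article,
                  quantity = newQuantity
              });
          }
      }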

    Read the article

  • Integrating jQuery autocomplete with Google site search

    - by user1715700
    I have a bit of an odd situation. I have to implement search on a public-facing website, but that search must both search web pages and offer autocomplete/suggestion functionality that comes from a list of terms stored in a DB table. So, I'm wondering a couple of things: 1) should I be looking at Google search and jQuery autocomplete? 2) is there something else I should be looking at instead? 3) if this is the right path to be heading down, are there enough pointers on implementation? The crux of my problem is that the terms I need to use for the autocomplete/suggest functionality reside in a database and not on the web pages. So, I thought Google would be appropriate for searching the web pages and that I could sort of fill in the blanks, so to speak, with these terms from the DB. I'm going to say that there are roughly 20-40,000 terms or so that need autocomplete, but that is really just a very rough guess; it could be less. I'm open to ideas and not really married to any particular solution. However, I will admit to liking the idea of offloading the search to Google. I hear they have a good algorithm ;) Any ideas, thoughts, or leads are greatly appreciated!

    Read the article

  • Infrastructure for high transactional system (language & hosting suggestion help)

    - by RPS
    Some of our friends (university students) are trying to develop a Twitter-type application. I want to plan for at least 1000 transactions per second (I know it's wishful thinking) for the initial launch. This involves several people connecting, getting updates, and posting (text + images) to the site. On the back end, the DB will serve the data and also calculate rankings of what to push to the user based on a complex algorithm, on the fly and in real time. Our group is familiar with Java and Tomcat/MySQL. We can also easily learn/code in PHP/MySQL. What is the best-suited platform for our purpose? Though Java seems easy for us to implement, I am afraid that hosting will be a bit difficult. I could find cloud-based PHP hosting services (like Rackspace Cloud Sites) at reasonable cost. Amazon EC2 is a bit over our heads to manage day-to-day. Also, any recommendation on hosting? (PHP or Java) We don't have millions in seed money, but about $20K to start with. Any advice on the above, or on the general approach, is much appreciated.

    Read the article

  • Why doesn't this work in C++?

    - by user3377450
    I'm doing something and I have this: //main.cpp file template<typename t1, typename t2> std::ostream& operator<<(std::ostream& os, const std::pair<t1, t2>& pair) { return os << "< " << pair.first << " , " << pair.second << " >"; } int main() { std::map<int, int> map = { { 1, 2 }, { 2, 3 } }; std::cout << *map.begin() << std::endl;//This works std::copy(map.begin(), map.end(), std::ostream_iterator<std::pair<int,int> >(std::cout, " "));//this doesn't work } I guess this is not working because in the std::copy algorithm the operator isn't defined, but what can I do?

    Read the article

  • Java getMethod with subclass parameter

    - by SelectricSimian
    I'm writing a library that uses reflection to find and call methods dynamically. Given just an object, a method name, and a parameter list, I need to call the given method as though the method call were explicitly written in the code. I've been using the following approach, which works in most cases: static void callMethod(Object receiver, String methodName, Object[] params) { Class<?>[] paramTypes = new Class<?>[params.length]; for (int i = 0; i < params.length; i++) { paramTypes[i] = params[i].getClass(); } receiver.getClass().getMethod(methodName, paramTypes).invoke(receiver, params); } However, when one of the parameters is a subclass of one of the declared parameter types for the method, the reflection API throws a NoSuchMethodException. For example, if the receiver's class has testMethod(Foo) defined, the following fails: receiver.getClass().getMethod("testMethod", FooSubclass.class).invoke(receiver, new FooSubclass()); even though this works: receiver.testMethod(new FooSubclass()); How do I resolve this? If the method call is hard-coded there's no issue - the compiler just uses the overloading algorithm to pick the best applicable method. It doesn't work with reflection, though, which is what I need. Thanks in advance!

    Read the article

  • [PHP] Does unsetting array values during iteration save on memory?

    - by saturn_rising
    Hello fellow code warriors, This is a simple programming question, coming from my lack of knowledge of how PHP handles array copying and unsetting during a foreach loop. It's like this, I have an array that comes to me from an outside source formatted in a way I want to change. A simple example would be: $myData = array('Key1' => array('value1', 'value2')); But what I want would be something like: $myData = array([0] => array('MyKey' => array('Key1' => array('value1', 'value2')))); So I take the first $myData and format it like the second $myData. I'm totally fine with my formatting algorithm. My question lies in finding a way to conserve memory since these arrays might get a little unwieldy. So, during my foreach loop I copy the current array value(s) into the new format, then I unset the value I'm working with from the original array. E.g.: $formattedData = array(); foreach ($myData as $key => $val) { // do some formatting here, copy to $reformattedVal $formattedData[] = $reformattedVal; unset($myData[$key]); } Is the call to unset() a good idea here? I.e., does it conserve memory since I have copied the data and no longer need the original value? Or, does PHP automatically garbage collect the data since I don't reference it in any subsequent code? The code runs fine, and so far my datasets have been too negligible in size to test for performance differences. I just don't know if I'm setting myself up for some weird bugs or CPU hits later on. Thanks for any insights. -sR

    Read the article

  • Generic list/sublist handling

    - by user628661
    Let's say we have a class class ComplexCls { public int Fld1; public string Fld2; //could be more fields } class Cls { public int SomeField; } and then some code class ComplexClsList : List<ComplexCls>; ComplexClsList myComplexList; // fill myComplexList // same for Cls class ClsList : List<Cls>; ClsList myClsList; We want to populate myClsList from myComplexList, something like (pseudocode): foreach ComplexItem in myComplexList { Cls ClsItem = new Cls(); ClsItem.SomeField = ComplexItem.Fld1; } The code to do this is easy and will be put in some method in myClsList. However, I'd like to design this to be as generic as possible, for a generic ComplexCls. Note that the exact ComplexCls is known at the moment of using this code; only the algorithm should be generic. I know it can be done using (direct) reflection, but is there another solution? Let me know if the question is not clear enough (it probably isn't). [EDIT] Basically, what I need is this: having myClsList, I need to specify a DataSource (ComplexClsList) and a field from that DataSource (Fld1) that will be used to populate my SomeField
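    A hedged sketch of two generic options (all member names besides Cls/SomeField/Fld1 are illustrative): a selector delegate avoids reflection entirely, while a field-name overload gives the "DataSource + field" shape mentioned in the edit.

      using System;
      using System.Collections.Generic;
      using System.Reflection;

      class Cls { public int SomeField; }

      class ClsList : List<Cls>
      {
          // Generic population: the caller supplies the source list and a selector
          // that extracts the value to copy into SomeField.
          public void PopulateFrom<TSource>(IEnumerable<TSource> source, Func<TSource, int> selector)
          {
              foreach (var item in source)
                  Add(new Cls { SomeField = selector(item) });
          }

          // Variant driven by a public field name, using reflection.
          public void PopulateFrom<TSource>(IEnumerable<TSource> source, string fieldName)
          {
              FieldInfo field = typeof(TSource).GetField(fieldName);
              foreach (var item in source)
                  Add(new Cls { SomeField = (int)field.GetValue(item) });
          }
      }

    Usage would then look like myClsList.PopulateFrom(myComplexList, c => c.Fld1); or myClsList.PopulateFrom(myComplexList, "Fld1");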

    Read the article

  • Spam Activity From my computer

    - by Bnymn
    I'm using Ubuntu 12.04 64bit. I'm using HTTP proxy over ssh as mentioned here. If I do not start TinyProxy, everything is OK. But, when I start TinyProxy, I'm getting the following. I think there is an application running on my machine and watching the proxy to start. But I could not decide which one it could be. ps ax PID TTY STAT TIME COMMAND 1 ? Ss 0:01 /sbin/init 2 ? S 0:00 [kthreadd] 3 ? S 0:00 [ksoftirqd/0] 6 ? S 0:00 [migration/0] 7 ? S 0:00 [watchdog/0] 21 ? S< 0:00 [cpuset] 22 ? S< 0:00 [khelper] 23 ? S 0:00 [kdevtmpfs] 24 ? S< 0:00 [netns] 26 ? S 0:00 [sync_supers] 27 ? S 0:00 [bdi-default] 28 ? S< 0:00 [kintegrityd] 29 ? S< 0:00 [kblockd] 30 ? S< 0:00 [ata_sff] 31 ? S 0:00 [khubd] 32 ? S< 0:00 [md] 34 ? S 0:00 [khungtaskd] 35 ? S 0:00 [kswapd0] 36 ? SN 0:00 [ksmd] 37 ? SN 0:00 [khugepaged] 38 ? S 0:00 [fsnotify_mark] 39 ? S 0:00 [ecryptfs-kthrea] 40 ? S< 0:00 [crypto] 48 ? S< 0:00 [kthrotld] 49 ? S 0:00 [scsi_eh_0] 50 ? S 0:00 [scsi_eh_1] 51 ? S 0:00 [scsi_eh_2] 52 ? S 0:00 [scsi_eh_3] 75 ? S< 0:00 [devfreq_wq] 240 ? S< 0:00 [xfs_mru_cache] 241 ? S< 0:00 [xfslogd] 242 ? S< 0:00 [xfsdatad] 243 ? S< 0:00 [xfsconvertd] 245 ? S 0:00 [xfsbufd/sda3] 246 ? S 0:01 [xfsaild/sda3] 330 ? S 0:00 upstart-udev-bridge --daemon 333 ? Ss 0:00 /sbin/udevd --daemon 472 ? S< 0:00 [cfg80211] 479 ? S< 0:00 [kpsmoused] 671 ? S 0:00 upstart-socket-bridge --daemon 779 ? S 0:00 [xfsbufd/sda4] 781 ? S 0:01 [xfsaild/sda4] 785 ? S< 0:00 [ttm_swap] 800 ? S< 0:00 [hd-audio0] 803 ? S< 0:00 [hd-audio1] 857 ? Sl 0:00 rsyslogd -c5 869 ? Ss 0:04 dbus-daemon --system --fork --activation=upstart 881 ? Ss 0:00 /usr/sbin/modem-manager 883 ? Ss 0:00 /usr/sbin/bluetoothd 905 ? Ssl 0:02 NetworkManager 906 ? Ss 0:00 /usr/sbin/cupsd -F 910 ? Sl 0:02 /usr/lib/policykit-1/polkitd --no-debug 918 ? S 0:00 avahi-daemon: running [bunyamin-hp.local] 919 ? S 0:00 avahi-daemon: chroot helper 920 ? S< 0:00 [krfcommd] 956 ? Ss 0:00 /sbin/wpa_supplicant -B -P /run/sendsigs.omit.d/wpasupplicant.pid -u -s -O /var/run/wpa_supplicant 980 tty4 Ss+ 0:00 /sbin/getty -8 38400 tty4 985 tty5 Ss+ 0:00 /sbin/getty -8 38400 tty5 1000 tty2 Ss+ 0:00 /sbin/getty -8 38400 tty2 1006 tty3 Ss+ 0:00 /sbin/getty -8 38400 tty3 1009 tty6 Ss+ 0:00 /sbin/getty -8 38400 tty6 1024 ? Ss 0:00 acpid -c /etc/acpi/events -s /var/run/acpid.socket 1025 ? Ss 0:00 atd 1026 ? Ss 0:00 cron 1029 ? Ss 0:01 /usr/sbin/irqbalance 1034 ? Ssl 0:00 whoopsie 1091 ? Ssl 0:00 lightdm 1216 tty1 Ss+ 0:00 /sbin/getty -8 38400 tty1 1224 ? Sl 0:00 /usr/lib/accountsservice/accounts-daemon 1241 ? Sl 0:00 /usr/sbin/console-kit-daemon --no-daemon 1356 ? Sl 0:00 /usr/lib/upower/upowerd 1447 ? Sl 0:00 /usr/lib/x86_64-linux-gnu/colord/colord 1539 ? SNl 0:00 /usr/lib/rtkit/rtkit-daemon 1723 ? Sl 0:00 /usr/lib/udisks/udisks-daemon 1724 ? S 0:00 udisks-daemon: not polling any devices 2077 ? Z 0:00 [lightdm] <defunct> 2433 ? Z 0:00 [lightdm] <defunct> 3491 ? S 0:00 [flush-8:0] 4023 ? S 0:00 [kworker/u:14] 4034 ? S 0:00 [migration/1] 4035 ? S 0:00 [kworker/1:3] 4036 ? S 0:00 [ksoftirqd/1] 4037 ? S 0:00 [watchdog/1] 4038 ? S 0:00 [migration/2] 4040 ? S 0:00 [ksoftirqd/2] 4041 ? S 0:00 [watchdog/2] 4042 ? S 0:00 [migration/3] 4043 ? S 0:00 [kworker/3:1] 4044 ? S 0:00 [ksoftirqd/3] 4045 ? S 0:00 [watchdog/3] 4047 ? S 0:00 [irq/43-mei] 4070 ? S 0:00 [kworker/3:0] 4072 ? S 0:00 [kworker/1:0] 4164 ? Ss 0:00 anacron -s 4549 tty7 Ss+ 1:13 /usr/bin/X :0 -auth /var/run/lightdm/root/:0 -nolisten tcp vt7 -novtswitch 4683 ? Sl 0:00 lightdm --session-child 12 47 4718 ? 
Sl 0:00 /usr/bin/gnome-keyring-daemon --daemonize --login 4729 ? Ssl 0:00 gnome-session --session=gnome-fallback 4765 ? Ss 0:00 /usr/bin/ssh-agent /usr/bin/dbus-launch --exit-with-session gnome-session --session=gnome-fallback 4768 ? S 0:00 /usr/bin/dbus-launch --exit-with-session gnome-session --session=gnome-fallback 4769 ? Ss 0:00 //bin/dbus-daemon --fork --print-pid 5 --print-address 7 --session 4779 ? Sl 0:01 /usr/lib/gnome-settings-daemon/gnome-settings-daemon 4786 ? S 0:00 /usr/lib/gvfs/gvfsd 4788 ? Sl 0:00 /usr/lib/gvfs//gvfs-fuse-daemon -f /home/bunyamin/.gvfs 4797 ? Sl 0:00 /usr/lib/gnome-settings-daemon/gsd-printer 4799 ? Sl 0:03 metacity 4805 ? S 0:00 /usr/lib/x86_64-linux-gnu/gconf/gconfd-2 4811 ? Sl 0:10 gnome-panel 4814 ? S 0:00 syndaemon -i 2.0 -K -R -t 4819 ? S<l 0:00 /usr/bin/pulseaudio --start --log-target=syslog 4821 ? Sl 0:00 /usr/lib/dconf/dconf-service 4826 ? Sl 0:00 /usr/lib/gnome-settings-daemon/gnome-fallback-mount-helper 4828 ? Sl 0:06 nautilus -n 4830 ? Sl 0:02 nm-applet 4832 ? Sl 0:00 /usr/lib/policykit-1-gnome/polkit-gnome-authentication-agent-1 4835 ? Sl 0:00 bluetooth-applet 4851 ? S 0:00 /usr/lib/pulseaudio/pulse/gconf-helper 4854 ? Sl 0:04 /usr/lib/indicator-applet/indicator-applet-complete 4859 ? S 0:00 /usr/lib/gvfs/gvfs-gdu-volume-monitor 4863 ? S 0:00 /usr/lib/gvfs/gvfs-gphoto2-volume-monitor 4865 ? Sl 0:00 /usr/lib/gvfs/gvfs-afc-volume-monitor 4871 ? S 0:00 /usr/lib/gvfs/gvfsd-trash --spawner :1.6 /org/gtk/gvfs/exec_spaw/0 4874 ? Sl 0:00 /usr/lib/indicator-application/indicator-application-service 4876 ? Sl 0:00 /usr/lib/indicator-datetime/indicator-datetime-service 4878 ? Sl 0:00 /usr/lib/indicator-messages/indicator-messages-service 4887 ? Sl 0:00 /usr/lib/indicator-printers/indicator-printers-service 4888 ? Sl 0:00 /usr/lib/indicator-session/indicator-session-service 4889 ? Sl 0:00 /usr/lib/indicator-sound/indicator-sound-service 4906 ? S 0:00 /usr/lib/geoclue/geoclue-master 4929 ? S 0:00 /usr/lib/ubuntu-geoip/ubuntu-geoip-provider 4938 ? Sl 0:11 /usr/lib/gnome-applets/multiload-applet-2 4939 ? Sl 0:01 /usr/lib/gnome-applets/cpufreq-applet 4953 ? S 0:00 /usr/lib/gvfs/gvfsd-metadata 4955 ? S 0:00 /usr/lib/gvfs/gvfsd-burn --spawner :1.6 /org/gtk/gvfs/exec_spaw/1 4957 ? Sl 3:22 /usr/lib/firefox/firefox 4973 ? Sl 0:00 /usr/lib/x86_64-linux-gnu/at-spi2-core/at-spi-bus-launcher 4997 ? Sl 0:00 /usr/lib/gnome-disk-utility/gdu-notification-daemon 5000 ? Sl 0:00 telepathy-indicator 5007 ? Sl 0:00 /usr/lib/telepathy/mission-control-5 5012 ? Sl 0:00 /usr/lib/gnome-online-accounts/goa-daemon 5018 ? Sl 0:00 gnome-screensaver 5019 ? Sl 0:01 zeitgeist-datahub 5025 ? Sl 0:00 /usr/bin/zeitgeist-daemon 5033 ? Sl 0:00 /usr/lib/zeitgeist/zeitgeist-fts 5041 ? S 0:00 /bin/cat 5052 ? Sl 0:08 /usr/bin/gnome-terminal -x /bin/sh -c '/home/bunyamin/Desktop/SSH Tunnel' 5058 ? S 0:00 gnome-pty-helper 5067 ? Sl 0:00 update-notifier 5090 ? S 0:00 /usr/bin/python /usr/lib/system-service/system-service-d 5130 ? Sl 0:00 /usr/lib/deja-dup/deja-dup/deja-dup-monitor 5135 ? S 0:00 /bin/sh -c nice run-parts --report /etc/cron.daily 5136 ? SN 0:00 run-parts --report /etc/cron.daily 5358 pts/4 Ss 0:00 bash 5482 ? S 0:00 [kworker/0:1] 5487 ? S 0:01 [kworker/2:0] 5550 ? Sl 1:15 /usr/lib/firefox/plugin-container /usr/lib/flashplugin-installer/libflashplayer.so -greomni /usr/lib/firefox/omni.ja 4957 true plugin 5717 ? S 0:00 /usr/lib/cups/notifier/dbus dbus:// 5824 ? SN 0:00 /bin/sh /etc/cron.daily/update-notifier-common 5825 ? 
SN 0:00 /usr/bin/python /usr/lib/update-notifier/package-data-downloader 5872 ? Sl 0:00 /usr/lib/notify-osd/notify-osd 5888 ? S 0:00 /sbin/udevd --daemon 5889 ? S 0:00 /sbin/udevd --daemon 5909 ? S 0:00 /sbin/dhclient -d -4 -sf /usr/lib/NetworkManager/nm-dhcp-client.action -pf /var/run/sendsigs.omit.d/network-manager.dhclient-eth1.pid -lf /var/lib/dhcp/dhclient-f5f0 5912 ? S 0:00 /usr/sbin/dnsmasq --no-resolv --keep-in-foreground --no-hosts --bind-interfaces --pid-file=/var/run/sendsigs.omit.d/network-manager.dnsmasq.pid --listen-address=127. 5975 pts/1 Ss+ 0:00 /bin/sh -c '/home/bunyamin/Desktop/SSH Tunnel' 5976 pts/1 S+ 0:00 /bin/sh /home/bunyamin/Desktop/SSH Tunnel 5977 pts/1 S+ 0:00 ssh -p443 [email protected] -L 8000:127.0.0.1:8000 5980 ? Sl 0:00 /usr/lib/gvfs/gvfsd-http --spawner :1.6 /org/gtk/gvfs/exec_spaw/2 6034 ? S 0:00 [kworker/u:0] 6054 ? S 0:00 [kworker/2:2] 6070 ? S 0:00 [kworker/0:3] 6094 ? Sl 0:02 gedit /home/bunyamin/Desktop/a.html 6101 ? S 0:00 [kworker/0:2] 6130 pts/4 R+ 0:00 ps ax TinyProxy LOG connect to ad.adserverplus.com:80 mx1.u4gf.com - - [17/Oct/2012 07:38:53] "GET http://ad.tagjunction.com/imp?Z=160x600&s=2959021&T=3&_salt=1516586745&B=12&m=2&u=http%3A%2F%2Fsunshinefelling.com%2Findex.php%3Fview%3Darticle%26catid%3D45%253Aplus-size-dresses%26id%3D7512%253A2012-01-25-22-42-00%26format%3Dpdf%26option%3Dcom_content%26Itemid%3D101&r=1 HTTP/1.0" - - bye bye bye connect to ad.adserverplus.com:80 connect to ad.bharatstudent.com:80 connect to ad.yieldmanager.com:80 142.91.199.250.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.adserverplus.com/imp?Z=0x0&y=29&s=2913320&_salt=2228719469&B=12&m=2&r=1 HTTP/1.0" - - 173.208.94.117 - - [17/Oct/2012 07:38:53] "GET http://ad.adserverplus.com/imp?Z=0x0&y=29&s=3187816&_salt=462045326&B=12&m=2&r=1 HTTP/1.0" - - mx1.a54m.com - - [17/Oct/2012 07:38:53] "GET http://ad.adserverplus.com/imp?Z=300x250&s=2887338&T=3&_salt=2925281520&B=12&m=2&u=http%3A%2F%2Fsecretskirt.com%2Findex.php%3Foption%3Dcom_contact%26view%3Dcontact%26id%3D1%26Itemid%3D95&r=1 HTTP/1.0" - - 108.62.75.54.rdns.ubiquityservers.com - - [17/Oct/2012 07:38:53] "GET http://ad.yieldmanager.com/imp?Z=300x250&s=3218437&T=3&_salt=2939054384&B=12&m=2&u=http%3A%2F%2Fwww.vifinances.com%2Ffinance-investing%2Finsurance-investment%2Fis-life-insurance-investment-necessarily-the-way-to-go.html&r=1 HTTP/1.0" - - connect to ad.yieldmanager.com:80 connect to ad.globe7.com:80 bye connect to ad.globe7.com:80 connect to ad.globe7.com:80 bye 173.208.94.22 - - [17/Oct/2012 07:38:53] "GET http://ad.yieldmanager.com/imp?Z=728x90&s=2922824&T=3&_salt=705371051&B=12&m=2&u=%3A%2F%2Fsunshinefelling.com%2Findex.php%3Fview%3Darticle%26catid%3D44%3Amature-womens-fashion%26id%3D6917%3A2012-01-25-22-37-27%26tmpl%3Dcomponent%26print%3D1%26layout%3Ddefault%26page%3D&r=1 HTTP/1.0" - - bye 23.19.10.44.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.globe7.com/st?ad_type=iframe&ad_size=160x600&section=3512129&pub_url=${PUB_URL} HTTP/1.0" - - connect to ad.yieldmanager.com:80 bye 142.91.189.27.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.globe7.com/imp?Z=0x0&y=29&s=3660215&_salt=2921537966&B=12&m=2&r=1 HTTP/1.0" - - connect to ad.scanmedios.com:80 bye 142.91.217.158.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.globaltakeoff.net/st?ad_type=iframe&ad_size=160x600&section=2077929&pub_url=${PUB_URL} HTTP/1.0" - - 23.19.76.194.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET 
http://ad.yieldmanager.com/imp?Z=728x90&s=3127996&T=3&_salt=1952612979&B=12&m=2&u=http%3A%2F%2Fwww.oseey.com%2Fpure-core-watch%2Fcarbon-fiber-watch%2Fcarbon-monoxide-poisoning-awareness.html&r=1 HTTP/1.0" - - mx1.e6sb.com - - [17/Oct/2012 07:38:53] "GET http://ad.scanmedios.com/imp?Z=728x90&s=3522638&T=3&_salt=3444993091&B=12&m=2&u=http%3A%2F%2Fsunshinefelling.com%2Findex.php%3Foption%3Dcom_content%26view%3Darticle%26id%3D6013%3A2012-01-25-22-25-54%26catid%3D40%3Abig-beautiful-women-fashion%26Itemid%3D96&r=1 HTTP/1.0" - - connect to ad.tagjunction.com:80 connect to ad.yieldmanager.com:80 bye connect to ad.yieldmanager.com:80 23.19.76.154.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.adserverplus.com/st?ad_type=iframe&ad_size=300x250&section=2569393 HTTP/1.0" - - connect to ads.creafi-online-media.com:80 bye 108.62.109.115.rdns.ubiquityservers.com - - [17/Oct/2012 07:38:53] "GET http://ad.yieldmanager.com/imp?Z=0x0&y=29&s=3315330&_salt=2385926515&B=12&m=2&r=1 HTTP/1.0" - - 142.91.217.214.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.yieldmanager.com/imp?Z=160x600&s=3634166&T=3&_salt=1590442300&B=12&m=2&u=http%3A%2F%2Fwealthterritory.com%2Findex.php%3Foption%3Dcom_mailto%26tmpl%3Dcomponent%26link%3DaHR0cDovL3dlYWx0aHRlcnJpdG9yeS5jb20vaW5kZXgucGhwP29wdGlvbj1jb21fY29udGVudCZ2aWV3PWFydGljbGUmaWQ9NDY2NDoyMDExLTA3LTA2LTEzLTI2LTUwJmNhdGlkPTQxOnNlcnZpY2VzJkl0ZW1pZ&r=1 HTTP/1.0" - - 108.62.185.184.rdns.ubiquityservers.com - - [17/Oct/2012 07:38:53] "GET http://ads.creafi-online-media.com/imp?Z=728x90&s=2885766&T=3&_salt=107120374&B=12&m=2&u=http%3A%2F%2Feconomicccore.com%2Findex.php%3Foption%3Dcom_content%26view%3Dcategory%26layout%3Dblog%26id%3D48%26Itemid%3D98%26limitstart%3D45&r=1 HTTP/1.0" - - bye bye bye connect to ad.adserverplus.com:80 connect to ad.yieldmanager.com:80 connect to ad.tagjunction.com:80 bye 108.62.75.252.rdns.ubiquityservers.com - - [17/Oct/2012 07:38:53] "GET http://ad.adserverplus.com/st?ad_type=iframe&ad_size=728x90&section=3213387&pub_url=${PUB_URL} HTTP/1.0" - - bye connect to ad.tagjunction.com:80 bye connect to ad.yieldmanager.com:80 173.208.94.29 - - [17/Oct/2012 07:38:53] "GET http://ad.tagjunction.com/st?ad_type=iframe&ad_size=728x90&section=3006024&pub_url=${PUB_URL} HTTP/1.0" - - 23.19.31.84.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.yieldmanager.com/imp?Z=0x0&y=29&s=2586703&_salt=2905995697&B=12&m=2&r=1 HTTP/1.0" - - oxx-ef-Words.ipwagon.net - - [17/Oct/2012 07:38:53] "GET http://ad.tagjunction.com/imp?Z=0x0&y=29&s=3630499&_salt=4037530564&B=12&m=2&r=1 HTTP/1.0" - - 142.91.185.53.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.tagjunction.com/imp?Z=0x0&y=29&s=3512541&_salt=1134875077&B=12&m=2&r=1 HTTP/1.0" - - connect to ad.globe7.com:80 108.177.187.37.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.yieldmanager.com/imp?Z=300x250&s=3168350&T=3&_salt=548860046&B=12&m=2&u=http%3A%2F%2Flifehealthyliving.com%2Findex.php%3Fview%3Darticle%26catid%3D34%253Ahealthy-food%26id%3D4681%253A2012-05-16-20-40-19%26tmpl%3Dcomponent%26print%3D1%26layout%3Ddefault%26page%3D%26option%3Dcom_content%26Itemid%3D53&r=1 HTTP/1.0" - - connect to ad.adserverplus.com:80 bye connect to ads.creafi-online-media.com:80 108.177.223.180.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.adserverplus.com/imp?Z=300x250&s=3331290&T=3&_salt=1270334669&B=12&m=2&u=http%3A%2F%2Fwww.vegls.com%2Faccident-attorneys-firms%2Fauto-accident-attorney%2Ffind-the-correct-auto-accident-attorney.html&r=1 HTTP/1.0" - - bye 
142.91.185.38.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.globe7.com/st?ad_type=iframe&ad_size=160x600&section=818253 HTTP/1.0" - - connect to ad.yieldmanager.com:80 bye bye bye 108.62.75.230.rdns.ubiquityservers.com - - [17/Oct/2012 07:38:53] "GET http://ads.creafi-online-media.com/st?ad_type=pop&ad_size=0x0&section=3323456&banned_pop_types=29&pop_times=1&pop_frequency=86400&pub_url=${PUB_URL} HTTP/1.0" - - connect to ad.adserverplus.com:80 bye connect to ad.adserverplus.com:80 bye 142.91.217.194.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.yieldmanager.com/imp?Z=300x250&s=3068801&T=3&_salt=1246107431&B=12&m=2&u=http%3A%2F%2Fmoodoffashionandbeauty.com%2Findex.php%3Foption%3Dcom_content%26view%3Darticle%26id%3D756%3A2011-07-13-13-13-43%26catid%3D36%3Afashion-clothes%26Itemid%3D55&r=1 HTTP/1.0" - - connect to ad.smxchange.com:80 108.62.185.235.rdns.ubiquityservers.com - - [17/Oct/2012 07:38:53] "GET http://ad.adserverplus.com/st?ad_type=iframe&ad_size=300x250&section=3307618&pub_url=${PUB_URL} HTTP/1.0" - - connect to ad.globe7.com:80 bye connect to ad.yieldmanager.com:80 bye bye connect to ad.adserverplus.com:80 connect to ad.yieldmanager.com:80 connect to ad.adserverplus.com:80 connect to ad.yieldmanager.com:80 108.177.168.183.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.globe7.com/imp?Z=300x250&s=3582877&T=3&_salt=3271923155&B=12&m=2&u=http%3A%2F%2Fwomenhealthroad.com%2Findex.php%3Foption%3Dcom_content%26view%3Darticle%26id%3D5780%3A2011-12-12-16-56-53%26catid%3D40%3Ahealth-issues%26Itemid%3D96&r=1 HTTP/1.0" - - 23.19.3.100.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.yieldmanager.com/imp?Z=160x600&s=2895969&T=3&_salt=207805714&B=12&m=2&u=http%3A%2F%2Feconomicccore.com%2Findex.php%3Fview%3Darticle%26catid%3D46%253Aeconomic-news%26id%3D6079%253A2011-09-29-07-39-13%26format%3Dpdf%26option%3Dcom_content%26Itemid%3D96&r=1 HTTP/1.0" - - bye 142.91.199.212.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.adserverplus.com/st?ad_type=iframe&ad_size=300x250&section=2956039&pub_url=${PUB_URL} HTTP/1.0" - - bye 142.91.189.169.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.yieldmanager.com/imp?Z=728x90&s=3004691&T=3&_salt=2747591679&B=12&m=2&u=http%3A%2F%2Fwww.qtsfinancial.com%2Findex.php%3Foption%3Dcom_content%26view%3Darticle%26id%3D5406%3Afinancial-statement-english-page%26catid%3D43%3Afinancial-analysis%26Itemid%3D99&r=1 HTTP/1.0" - - connect to ad.adserverplus.com:80 23.19.31.58.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.yieldmanager.com/imp?Z=0x0&y=29&s=3323560&_salt=3172064457&B=12&m=2&r=1 HTTP/1.0" - - connect to ad.adserverplus.com:80 iei-ix-Words.ipwagon.net - - [17/Oct/2012 07:38:53] "GET http://ad.adserverplus.com/imp?Z=728x90&s=3187813&T=3&_salt=1110944041&B=12&m=2&u=http%3A%2F%2Fwww.workinhouses.com%2Fhtml%2Fwallingford-ct-connecticuts-best-places-for-your-home.html&r=1 HTTP/1.0" - - connect to cookex.amp.yahoo.com:80 173.208.94.116 - - [17/Oct/2012 07:38:53] "GET http://ad.adserverplus.com/st?ad_type=iframe&ad_size=300x250&section=3213592&pub_url=${PUB_URL} HTTP/1.0" - - bye bye connect to ad.yieldmanager.com:80 connect to ads.creafi-online-media.com:80 bye 108.62.75.99.rdns.ubiquityservers.com - - [17/Oct/2012 07:38:53] "GET http://ad.adserverplus.com/imp?Z=160x600&s=2913321&T=3&_salt=333033369&B=12&m=2&u=http%3A%2F%2Ffashionstreetlight.com%2Findex.php%3Foption%3Dcom_content%26view%3Darticle%26id%3D28850%3A2011-12-20-12-59-39%26catid%3D45%3Afashion-accessories%26Itemid%3D101&r=1 HTTP/1.0" - 
- bye 142.91.217.208.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://cookex.amp.yahoo.com/v2/cexposer/SIG=18kthu27g/*http%3A//ad.yieldmanager.com/imp?Z=300x250&s=2682517&T=3&_salt=1378331643&B=12&m=2&u=http%3A%2F%2Fwww.economicwindows.com%2Findex.php%3Fview%3Darticle%26catid%3D40%253Afinancial-info%26id%3D3854%253A2011-07-06-13-25-37%26format%3Dpdf%26option%3Dcom_content%26Itemid%3D96&r=1 HTTP/1.0" - - bye bye bye 108.62.185.228.rdns.ubiquityservers.com - - [17/Oct/2012 07:38:53] "GET http://ad.yieldmanager.com/imp?Z=0x0&y=29&s=3315448&_salt=4241487555&B=12&m=2&r=1 HTTP/1.0" - - 108.62.185.220.rdns.ubiquityservers.com - - [17/Oct/2012 07:38:53] "GET http://ads.creafi-online-media.com/st?ad_type=iframe&ad_size=728x90&section=3269968 HTTP/1.0" - - connect to ad.tagjunction.com:80 bye connect to ad.globe7.com:80 bye 142.91.185.47.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.tagjunction.com/st?ad_type=pop&ad_size=0x0&section=2958317&banned_pop_types=29&pop_times=1&pop_frequency=0&pub_url=${PUB_URL} HTTP/1.0" - - bye 108.177.168.183.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.globe7.com/imp?Z=160x600&s=3582877&T=3&_salt=1313872999&B=12&m=2&u=http%3A%2F%2Fwomenhealthroad.com%2Findex.php%3Foption%3Dcom_content%26view%3Darticle%26id%3D5753%3A2011-12-12-16-56-46%26catid%3D40%3Ahealth-issues%26Itemid%3D96&r=1 HTTP/1.0" - - connect to ad.tagjunction.com:80 bye connect to ad.globe7.com:80 bye connect to ad.adserverplus.com:80 108.62.75.53.rdns.ubiquityservers.com - - [17/Oct/2012 07:38:53] "GET http://ad.tagjunction.com/imp?Z=300x250&s=3127172&T=3&_salt=2152278771&B=12&m=2&u=http%3A%2F%2Fwww.oslims.com%2Ffashion-coffee%2Ffashion-slimming-coffee%2Fso-whats-your-poison-coffee-or-tea.html&r=1 HTTP/1.0" - - connect to ad.yieldmanager.com:80 bye bye 108.62.75.170.rdns.ubiquityservers.com - - [17/Oct/2012 07:38:53] "GET http://ad.adserverplus.com/imp?Z=0x0&y=29&s=2909210&_salt=1773835502&B=12&m=2&r=1 HTTP/1.0" - - 23.19.79.3.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.globe7.com/st?ad_type=iframe&ad_size=728x90&section=3571505&pub_url=${PUB_URL} HTTP/1.0" - - 142.91.217.216.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.yieldmanager.com/imp?Z=160x600&s=3630472&T=3&_salt=462936220&B=12&m=2&u=http%3A%2F%2Fwww.economicwindows.com%2Findex.php%3Fview%3Darticle%26catid%3D41%253Afinancial-services%26id%3D4854%253A2011-07-06-13-26-56%26tmpl%3Dcomponent%26print%3D1%26layout%3Ddefault%26page%3D%26option%3Dcom_content%26Itemid%3D97&r=1 HTTP/1.0" - - connect to ad.yieldmanager.com:80 connect to ad.adserverplus.com:80 connect to ad.yieldmanager.com:80 bye connect to ad.yieldmanager.com:80 bye connect to ad.yieldmanager.com:80 142.91.189.176.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.yieldmanager.com/imp?Z=160x600&s=3187822&T=3&_salt=325267799&B=12&m=2&u=http%3A%2F%2Feconomysea.com%2Findex.php%3Foption%3Dcom_mailto%26tmpl%3Dcomponent%26link%3DaHR0cDovL2Vjb25vbXlzZWEuY29tL2luZGV4LnBocD9vcHRpb249Y29tX2NvbnRlbnQmdmlldz1hcnRpY2xlJmlkPTYzNDk6MjAxMS0wOS0yOC0yMC0wNC0xOSZjYXRpZD00NzplY29ub21pYy1uZXdzJkl0ZW1pZD05Nw&r=1 HTTP/1.0" - - connect to ad.adserverplus.com:80 142.91.190.240.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.yieldmanager.com/imp?Z=160x600&s=2956040&T=3&_salt=3354730349&B=12&m=2&u=http%3A%2F%2Fdomarketings.com%2Findex.php%3Foption%3Dcom_content%26view%3Darticle%26id%3D279%3AWhy-Contractor-Leads-Are-Best-For-Getting-Ideal-Construction-Prospects%26catid%3D2%3Abusiness&r=1 HTTP/1.0" - - bye 
108.62.75.6.rdns.ubiquityservers.com - - [17/Oct/2012 07:38:53] "GET http://ad.yieldmanager.com/imp?Z=160x600&s=3323456&T=3&_salt=1244915826&B=12&m=2&u=http%3A%2F%2Fdomarketings.com%2Findex.php%3Foption%3Dcom_content%26view%3Darticle%26id%3D989%3AThe-Basics-of-Failure-Mode-and-Effective-Analysis%26catid%3D2%3Abusiness&r=1 HTTP/1.0" - - bye 142.91.217.220.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.yieldmanager.com/imp?Z=728x90&s=2921135&T=3&_salt=1337464905&B=12&m=2&u=http%3A%2F%2Financezone.com%2Findex.php%3Foption%3Dcom_content%26view%3Darticle%26id%3D7236%3A2011-09-05-19-56-54%26catid%3D49%3Acareer-banking%26Itemid%3D99&r=1 HTTP/1.0" - - bye connect to ad.yieldmanager.com:80 108.62.178.229.rdns.ubiquityservers.com - - [17/Oct/2012 07:38:53] "GET http://ad.adserverplus.com/st?ad_type=iframe&ad_size=160x600&section=3168350&pub_url=${PUB_URL} HTTP/1.0" - - connect to ad.yieldmanager.com:80 108.177.168.187.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.smxchange.com/st?ad_type=iframe&ad_size=300x250&section=3285387&pop_nofreqcap=1&pub_url=${PUB_URL} HTTP/1.0" - - skg-wr-Words.ipwagon.net - - [17/Oct/2012 07:38:53] "GET http://ad.yieldmanager.com/imp?Z=0x0&y=29&s=3153972&_salt=3512711469&B=12&m=2&r=1 HTTP/1.0" - - bye connect to ad.yieldmanager.com:80 bye connect to ad.yieldmanager.com:80 mx1.u4gf.com - - [17/Oct/2012 07:38:53] "GET http://ad.yieldmanager.com/imp?Z=160x600&s=2959021&T=3&_salt=1516586745&B=12&m=2&u=http%3A%2F%2Fsunshinefelling.com%2Findex.php%3Fview%3Darticle%26catid%3D45%253Aplus-size-dresses%26id%3D7512%253A2012-01-25-22-42-00%26format%3Dpdf%26option%3Dcom_content%26Itemid%3D101&r=1 HTTP/1.0" - -

    Read the article

  • How to convert a DataSet object into an ObjectContext (Entity Framework) object on the fly?

    - by Marcel
    Hi all, I have an existing SQL Server database, where I store data from large specific log files (often 100 MB and more), one per database. After some analysis, the database is deleted again. From the database, I have created both an Entity Framework model and a DataSet model via the Visual Studio designers. The DataSet is only for bulk importing data with SqlBulkCopy, after a quite complicated parsing process. All queries are then done using the Entity Framework model, whose CreateQuery method is exposed via an interface like this public IQueryable<TTarget> GetResults<TTarget>() where TTarget : EntityObject, new() { return this.Context.CreateQuery<TTarget>(typeof(TTarget).Name); } Now, sometimes my files are very small and in such a case I would like to omit the import into the database, but just have an in-memory representation of the data, accessible as entities. The idea is to create the DataSet, but instead of bulk importing, to directly transfer it into an ObjectContext which is accessible via the interface. Does this make sense? Now here's what I have done for this conversion so far: I traverse all tables in the DataSet, convert the single rows into entities of the corresponding type and add them to an instantiated object of my typed entity context class, like so MyEntities context = new MyEntities(); //create new in-memory context ///.... //get the item in the navigations table MyDataSet.NavigationResultRow dataRow = ds.NavigationResult.First(); //here, a foreach would be necessary in a real-world scenario NavigationResult entity = new NavigationResult { Direction = dataRow.Direction, ///... NavigationResultID = dataRow.NavigationResultID }; //convert to entities context.AddToNavigationResult(entity); //add to entities ///.... Very tedious work, as I would need to create a converter for each of my entity types and iterate over each table in the DataSet I have. And beware if I ever change my database model.... Also, I have found out that I can only instantiate MyEntities if I provide a valid connection string to a SQL Server database. Since I do not want to actually write to my fully fledged database each time, this hinders my intentions. I intend to have only some in-memory proxy database. Can I do this more simply? Is there some automated way of doing such a conversion, like generating an ObjectContext out of a DataSet object? P.S: I have seen a few questions about unit testing that seem somewhat related, but not quite exact.
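    A hedged sketch of the generic row-to-entity conversion hinted at above (reflection-based property copying; it assumes column names match writable entity property names and that the entity set name equals the type name, which are assumptions rather than the questioner's code):

      using System;
      using System.Data;
      using System.Data.Objects;
      using System.Reflection;

      // Copies each DataRow of a table into a new TEntity by matching column names
      // to writable properties, then adds the entity to the ObjectContext.
      static void AddRowsAsEntities<TEntity>(DataTable table, ObjectContext context)
          where TEntity : new()
      {
          foreach (DataRow row in table.Rows)
          {
              var entity = new TEntity();
              foreach (DataColumn column in table.Columns)
              {
                  PropertyInfo prop = typeof(TEntity).GetProperty(column.ColumnName);
                  if (prop != null && prop.CanWrite && row[column] != DBNull.Value)
                      prop.SetValue(entity, row[column], null);
              }
              // Entity set name assumed to match the entity type name.
              context.AddObject(typeof(TEntity).Name, entity);
          }
      }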

    Read the article

  • Removing expired certificates from LDS (new ver of ADAM)

    - by jonthebrewer
    Hi all. This is my situation: we are in the process of replacing a certificate store currently hosted on Sun's iPlanet with Microsoft's Lightweight Directory Services (the new version of ADAM with Server 2008). These certificates have been imported into LDS into an application partition (say o=myorg, C=AU). Under this structure I have around 40,000 OUs, each one representing a customer, and under each customer's OU are one or more user (iNetOrg) objects (around 60,000 in all). In each user are one or more certificates in the UserCertificate attribute. A combination of in-house written application code and proprietary PKI code reads and publishes these certificates to validate financial transactions. As the LDAP path of the certificates is stored within the customer certificates (and within the application code), and there is zero appetite for changing any of the code, I have had to pick up the iPlanet directory as a whole and dump it into LDS in the same structure. (I will not be using or hosting a Microsoft CA, just implementing an LDAP-compliant directory to host these certificates.) We have fully tested the application using the data in LDS and everything works fine. Here is my dilemma and question (finally, phew!): there was no process put in place for removing revoked or expired certificates, so the vast majority of the data is completely useless; the system has been running for about 8 years! I have done a quick analysis and I estimate that at least 80% of the data is no longer valid. As I am taking on responsibility for managing the directory, I would like to start with a clean directory. Does anyone have any idea how I can clean up these expired certificates? I am not a highly experienced scripter but have some background in VB. I have been researching the use of CAPICOM and have a feeling it may be usable here, but in exactly what way I am not sure. I would prefer to write a script to which I could specify an expiration date (say, any certs that expired prior to 2010) and then run against the LDS partition. That way I can reuse the script periodically to clean up the directory (as mentioned above, I have no way to adjust the applications that are writing the certs; this is with a third party). Another, less attractive, alternative is to massage the LDIF file (2.7 million lines!) to rip the certs out prior to the import. Any help and advice MUCH appreciated. Cheers Jon
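    A hedged sketch of the kind of cleanup script described (in C# rather than VB, and the LDAP path, search filter, and attribute handling are assumptions about the LDS layout): search the partition for users carrying userCertificate values, parse each value with X509Certificate2, and remove the ones that expired before a cutoff date.

      using System;
      using System.DirectoryServices;
      using System.Security.Cryptography.X509Certificates;

      static void RemoveExpiredCertificates(string ldapPath, DateTime cutoff)
      {
          // ldapPath is illustrative, e.g. "LDAP://localhost:389/o=myorg,c=AU"
          using (var root = new DirectoryEntry(ldapPath))
          using (var searcher = new DirectorySearcher(root, "(userCertificate=*)"))
          {
              searcher.PageSize = 1000; // page through the ~60,000 user objects
              foreach (SearchResult result in searcher.FindAll())
              {
                  using (DirectoryEntry user = result.GetDirectoryEntry())
                  {
                      var expired = new System.Collections.ArrayList();
                      foreach (object value in user.Properties["userCertificate"])
                      {
                          var cert = new X509Certificate2((byte[])value);
                          if (cert.NotAfter < cutoff)
                              expired.Add(value);
                      }
                      foreach (object value in expired)
                          user.Properties["userCertificate"].Remove(value);
                      if (expired.Count > 0)
                          user.CommitChanges();
                  }
              }
          }
      }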

    Read the article

  • Please give the solution of the following programs in R Programming

    - by NEETHU
    Problem 1: The table below gives data concerning the performance of 28 National Football League teams in 1976. It is suspected that the number of yards gained rushing by opponents (x8) has an effect on the number of games won by a team (y). (a) Fit a simple linear regression model relating games won y to yards gained rushing by opponents x8. (b) Construct the analysis-of-variance table and test for significance of regression. (c) Find a 95% CI on the slope. (d) What percent of the total variability in y is explained by this model? (e) Find a 95% CI on the mean number of games won when opponents' yards rushing is limited to 2000 yards.
    Team  y   x8
    1     10  2205
    2     11  2096
    3     11  1847
    4     13  1803
    5     10  1457
    6     11  1848
    7     10  1564
    8     11  1821
    9      4  2577
    10     2  2476
    11     7  1984
    12    10  1917
    13     9  1761
    14     9  1709
    15     6  1901
    16     5  2288
    17     5  2072
    18     5  2861
    19     6  2411
    20     4  2289
    21     3  2203
    22     3  2592
    23     4  2053
    24    10  1979
    25     6  2048
    26     8  1786
    27     2  2876
    28     0  2560
    Problem 2: Suppose we would like to use the model developed in Problem 1 to predict the number of games a team will win if it can limit opponents' yards rushing to 1800 yards. Find a point estimate of the number of games won when x8 = 1800. Find a 90% prediction interval on the number of games won.
    Problem 3: The purity of oxygen produced by a fractionation process is thought to be related to the percentage of hydrocarbon in the main condenser of the processing unit. 20 samples are shown below.
    Purity(%)  Hydrocarbon(%)
    86.91      1.02
    89.85      1.11
    90.28      1.43
    86.34      1.11
    92.58      1.01
    87.33      0.95
    86.29      1.11
    91.86      0.87
    95.61      1.43
    89.86      1.02
    96.73      1.46
    99.42      1.55
    98.66      1.55
    96.07      1.55
    93.65      1.4
    87.31      1.15
    95         1.01
    96.85      0.99
    85.2       0.95
    90.56      0.98
    (a) Fit a simple linear regression model to the data. (b) Test the hypothesis H0: β = 0. (c) Calculate R². (d) Find a 95% CI on the slope. (e) Find a 95% CI on the mean purity when the hydrocarbon % is 1.
    Problem 4: Consider the oxygen plant data in Problem 3 and assume that purity and hydrocarbon percentage are jointly normally distributed random variables. (a) What is the correlation between oxygen purity and hydrocarbon %? (b) Test the hypothesis that ρ = 0. (c) Construct a 95% CI for ρ.
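    For reference, a hedged sketch of the standard simple-linear-regression quantities these parts rely on (textbook formulas, not taken from the post):

      \hat{\beta}_1 = \frac{S_{xy}}{S_{xx}}
                    = \frac{\sum_i (x_i - \bar{x})(y_i - \bar{y})}{\sum_i (x_i - \bar{x})^2},
      \qquad
      \hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x},
      \qquad
      R^2 = 1 - \frac{SS_{\mathrm{res}}}{SS_{\mathrm{tot}}},
      \qquad
      \text{95\% CI on the slope: } \hat{\beta}_1 \pm t_{0.025,\,n-2}\,\sqrt{\frac{MS_{\mathrm{res}}}{S_{xx}}}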

    Read the article

  • Objective-C wrapper API design methodology

    - by Wade Williams
    I know there's no one answer to this question, but I'd like to get people's thoughts on how they would approach the situation. I'm writing an Objective-C wrapper to a C library. My goals are: 1) The wrapper use Objective-C objects. For example, if the C API defines a parameter such as char *name, the Objective-C API should use name:(NSString *). 2) The client using the Objective-C wrapper should not have to have knowledge of the inner-workings of the C library. Speed is not really any issue. That's all easy with simple parameters. It's certainly no problem to take in an NSString and convert it to a C string to pass it to the C library. My indecision comes in when complex structures are involved. Let's say you have: struct flow { long direction; long speed; long disruption; long start; long stop; } flow_t; And then your C API call is: void setFlows(flow_t inFlows[4]); So, some of the choices are: 1) expose the flow_t structure to the client and have the Objective-C API take an array of those structures 2) build an NSArray of four NSDictionaries containing the properties and pass that as a parameter 3) create an NSArray of four "Flow" objects containing the structure's properties and pass that as a parameter My analysis of the approaches: Approach 1: Easiest. However, it doesn't meet the design goals Approach 2: For some reason, this seems to me to be the most "Objective-C" way of doing it. However, each element of the NSDictionary would have to be wrapped in an NSNumber. Now it seems like we're doing an awful lot just to pass the equivalent of a struct. Approach 3: Seems the cleanest to me from an object-oriented standpoint and the extra encapsulation could come in handy later. However, like #2, it now seems like we're doing an awful lot (creating an array, creating and initializing objects) just to pass a struct. So, the question is, how would you approach this situation? Are there other choices I'm not considering? Are there additional advantages or disadvantages to the approaches I've presented that I'm not considering?

    Read the article

  • Setting up NCover for NUnit in FinalBuilder

    - by Lasse V. Karlsen
    I am attempting to set up NCover for usage in my FinalBuilder project, for a .NET 4.0 C# project, but my final coverage output file contains no coverage data. I am using: NCover 3.3.2 NUnit 2.5.4 FinalBuilder 6.3.0.2004 All tools are the latest official as of today. I've finally managed to coax FB into running my unit tests under NCover for the .NET 4.0 project, so I get Tests run: 184, ..., which is correct. However, the final Coverage.xml file output from NCover is almost empty, and looks like this: <?xml version="1.0" encoding="utf-8"?> <!-- saved from NCover 3.0 Export url='http://www.ncover.com/' --> <coverage profilerVersion="3.3.2.6211" driverVersion="3.3.2" exportversion="3" viewdisplayname="" startTime="2010-04-22T08:55:33.7471316Z" measureTime="2010-04-22T08:55:35.3462915Z" projectName="" buildid="27c78ffa-c636-4002-a901-3211a0850b99" coveragenodeid="0" failed="false" satisfactorybranchthreshold="95" satisfactorycoveragethreshold="95" satisfactorycyclomaticcomplexitythreshold="20" satisfactoryfunctionthreshold="80" satisfactoryunvisitedsequencepoints="10" uiviewtype="TreeView" viewguid="C:\Dev\VS.NET\LVK.IoC\LVK.IoC.Tests\bin\Debug\Coverage.xml" viewfilterstyle="None" viewreportstyle="SequencePointCoveragePercentage" viewsortstyle="Name"> <rebasedpaths /> <filters /> <documents> <doc id="0" excluded="false" url="None" cs="" csa="00000000-0000-0000-0000-000000000000" om="0" nid="0" /> </documents> </coverage> The output in FB log is: ... ***************** End Program Output ***************** Execution Time: 1,5992 s Coverage Xml: C:\Dev\VS.NET\LVK.IoC\LVK.IoC.Tests\bin\Debug\Coverage.xml NCover Success My configuration of the FB step for NCover: NCover what?: NUnit test coverage Command: C:\Program Files (x86)\NUnit 2.5.4\bin\net-2.0\nunit-console.exe Command arguments: LVK.IoC.Tests.dll /noshadow /framework:4.0.30319 /process=single /nothread Note: I've tried with and without the /process and /nothread options Working directory: %FBPROJECTDIR%\LVK.IoC.Tests\bin\Debug List of assemblies to profile: %FBPROJECTDIR%\LVK.IoC.Tests\bin\Debug\LVK.IoC.dll Note: I've tried just listing the name of the assembly, both with and without the extension. The documentation for the FB step doesn't help, as it only lists minor sentences for each property, and fails to give examples or troubleshooting hints. Since I want to pull the coverage results into NDepend to run build-time analysis, I want that file to contain the information I need. I am also using TestDriven, and if I right-click the solution file and select "Test with NCover", NCover-explorer opens up with coverage data, and if I ask it to show me the folder with coverage files, in there is an .xml file with the same structure as the one above, just with all the data that should be there, so the tools I have is certainly capable of producing it. Has anyone an idea of what I've configured wrong here?

    Read the article

  • .NET RegEx "Memory Leak" investigation

    - by Kevin Pullin
    I recently looked into some .NET "memory leaks" (i.e. unexpected, lingering GC rooted objects) in a WinForms app. After loading and then closing a huge report, the memory usage did not drop as expected even after a couple of gen2 collections. Assuming that the reporting control was being kept alive by a stray event handler I cracked open WinDbg to see what was happening... Using WinDbg, the !dumpheap -stat command reported a large amount of memory was consumed by string instances. Further refining this down with the !dumpheap -type System.String command I found the culprit, a 90MB string used for the report, at address 03be7930. The last step was to invoke !gcroot 03be7930 to see which object(s) were keeping it alive. My expectations were incorrect - it was not an unhooked event handler hanging onto the reporting control (and report string), but instead it was held on by a System.Text.RegularExpressions.RegexInterpreter instance, which itself is a descendant of a System.Text.RegularExpressions.CachedCodeEntry. Now, the caching of Regexs is (somewhat) common knowledge as this helps to reduce the overhead of having to recompile the Regex each time it is used. But what then does this have to do with keeping my string alive? Based on analysis using Reflector, it turns out that the input string is stored in the RegexInterpreter whenever a Regex method is called. The RegexInterpreter holds onto this string reference until a new string is fed into it by a subsequent Regex method invocation. I'd expect similar behaviour by hanging onto Regex.Match instances and perhaps others. The chain is something like this: Regex.Split, Regex.Match, Regex.Replace, etc Regex.Run RegexScanner.Scan (RegexScanner is the base class, RegexInterpreter is the subclass described above). The offending Regex is only used for reporting, rarely used, and therefore unlikely to be used again to clear out the existing report string. And even if the Regex was used at a later point, it would probably be processing another large report. This is a relatively significant problem and just plain feels dirty. All that said, I found a few options on how to resolve, or at least work around, this scenario. I'll let the community respond first and if no takers come forward I will fill in any gaps in a day or two.
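    One workaround consistent with the behaviour described (a hedged sketch; it may or may not be among the options the author has in mind): after the report has been processed, run the same Regex once against a tiny throwaway input so the interpreter's held reference is replaced and the 90 MB string becomes collectable.

      using System.Text.RegularExpressions;

      static string StripReportMarkup(string hugeReport)
      {
          // Hypothetical reporting regex; the pattern is illustrative.
          var reportRegex = new Regex(@"<[^>]+>");
          string result = reportRegex.Replace(hugeReport, string.Empty);

          // The interpreter keeps a reference to the last input it scanned,
          // so feed it a small string to let the large report be collected.
          reportRegex.IsMatch(" ");

          return result;
      }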

    Read the article

  • Lucene: Wildcards are missing from index

    - by Eleasar
    Hi - i am building a search index that contains special names - containing ! and ? and & and + and ... I have to tread the following searches different: me & you me + you But whatever i do (did try with queryparser escaping before indexing, escaped it manually, tried different indexers...) - if i check the search index with Luke they do not show up (question marks and @-symbols and the like show up) The logic behind is that i am doing partial searches for a live suggestion (and the fields are not that large) so i split it up into "m" and "me" and "+" and "y" and "yo" and "you" and then index it (that way it is way faster than a wildcard query search (and the index size is not a big problem). So what i would need is to also have this special wildcard characters be inserted into the index. This is my code: using System; using System.Collections.Generic; using System.IO; using System.Linq; using System.Text; using Lucene.Net.Analysis; using Lucene.Net.Util; namespace AnalyzerSpike { public class CustomAnalyzer : Analyzer { public override TokenStream TokenStream(string fieldName, TextReader reader) { return new ASCIIFoldingFilter(new LowerCaseFilter(new CustomCharTokenizer(reader))); } } public class CustomCharTokenizer : CharTokenizer { public CustomCharTokenizer(TextReader input) : base(input) { } public CustomCharTokenizer(AttributeSource source, TextReader input) : base(source, input) { } public CustomCharTokenizer(AttributeFactory factory, TextReader input) : base(factory, input) { } protected override bool IsTokenChar(char c) { return c != ' '; } } } The code to create the index: private void InitIndex(string path, Analyzer analyzer) { var writer = new IndexWriter(path, analyzer, true); //some multiline textbox that contains one item per line: var all = new List<string>(txtAllAvailable.Text.Replace("\r","").Split('\n')); foreach (var item in all) { writer.AddDocument(GetDocument(item)); } writer.Optimize(); writer.Close(); } private static Document GetDocument(string name) { var doc = new Document(); doc.Add(new Field( "name", DeNormalizeName(name), Field.Store.YES, Field.Index.ANALYZED)); doc.Add(new Field( "raw_name", name, Field.Store.YES, Field.Index.NOT_ANALYZED)); return doc; } (Code is with Lucene.net in version 1.9.x (EDIT: sorry - was 2.9.x) but is compatible with Lucene from Java) Thx

    Read the article

  • MemoryStream, XmlTextWriter and Warning 4 CA2202 : Microsoft.Usage

    - by rasx
    The Run Code Analysis command in Visual Studio 2010 Ultimate returns a warning when seeing a certain pattern with MemoryStream and XmlTextWriter. This is the warning: Warning 7 CA2202 : Microsoft.Usage : Object 'ms' can be disposed more than once in method 'KinteWritePages.GetXPathDocument(DbConnection)'. To avoid generating a System.ObjectDisposedException you should not call Dispose more than one time on an object.: Lines: 421 C:\Visual Studio 2010\Projects\Songhay.DataAccess.KinteWritePages\KinteWritePages.cs 421 Songhay.DataAccess.KinteWritePages This is the form: static XPathDocument GetXPathDocument(DbConnection connection) { XPathDocument xpDoc = null; var ms = new MemoryStream(); try { using(XmlTextWriter writer = new XmlTextWriter(ms, Encoding.UTF8)) { using(DbDataReader reader = CommonReader.GetReader(connection, Resources.KinteRssSql)) { writer.WriteStartDocument(); writer.WriteStartElement("data"); do { while(reader.Read()) { writer.WriteStartElement("item"); for(int i = 0; i < reader.FieldCount; i++) { writer.WriteRaw(String.Format("<{0}>{1}</{0}>", reader.GetName(i), reader[i].ToString())); } writer.WriteFullEndElement(); } } while(reader.NextResult()); writer.WriteFullEndElement(); writer.WriteEndDocument(); writer.Flush(); ms.Position = 0; xpDoc = new XPathDocument(ms); } } } finally { ms.Dispose(); } return xpDoc; } The same kind of warning is produced for this form: XPathDocument xpDoc = null; using(var ms = new MemoryStream()) { using(XmlTextWriter writer = new XmlTextWriter(ms, Encoding.UTF8)) { using(DbDataReader reader = CommonReader.GetReader(connection, Resources.KinteRssSql)) { //... } } } return xpDoc; By the way, the following form produces another warning: XPathDocument xpDoc = null; var ms = new MemoryStream(); using(XmlTextWriter writer = new XmlTextWriter(ms, Encoding.UTF8)) { using(DbDataReader reader = CommonReader.GetReader(connection, Resources.KinteRssSql)) { //... } } return xpDoc; The above produces the warning: Warning 7 CA2000 : Microsoft.Reliability : In method 'KinteWritePages.GetXPathDocument(DbConnection)', object 'ms' is not disposed along all exception paths. Call System.IDisposable.Dispose on object 'ms' before all references to it are out of scope. C:\Visual Studio 2010\Projects\Songhay.DataAccess.KinteWritePages\KinteWritePages.cs 383 Songhay.DataAccess.KinteWritePages In addition to the following, what are my options?: Supress warning CA2202. Supress warning CA2000 and hope that Microsoft is disposing of MemoryStream (because Reflector is not showing me the source code). Rewrite my legacy code to recognize the wonderful XDocument and LINQ to XML.
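    One pattern that normally satisfies CA2202 (a hedged sketch of the general technique rather than a guaranteed fix for this analyzer version): keep the stream in an outer variable, null it out once the writer has taken ownership, and dispose it in the finally block only if the writer was never created, so no object is ever disposed twice.

      using System.IO;
      using System.Text;
      using System.Xml;
      using System.Xml.XPath;

      static XPathDocument BuildDocument()
      {
          MemoryStream ms = null;
          try
          {
              ms = new MemoryStream();
              using (var writer = new XmlTextWriter(ms, Encoding.UTF8))
              {
                  Stream stream = ms;
                  ms = null; // the writer now owns the stream and will dispose it exactly once

                  writer.WriteStartDocument();
                  writer.WriteElementString("data", "example");
                  writer.WriteEndDocument();
                  writer.Flush();

                  stream.Position = 0;
                  return new XPathDocument(stream);
              }
          }
          finally
          {
              if (ms != null)
                  ms.Dispose();
          }
      }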

    Read the article

  • Debugging site written mainly in JScript with AJAX code injection

    - by blumidoo
    Hello, I have legacy code to maintain, and while trying to understand the logic behind the code, I have run into lots of annoying issues. The application is written mainly in JavaScript, with extensive usage of jQuery + different plugins, especially Accordion. It creates a wizard-like flow, where client code for the next step is downloaded in the background by injecting the result of a remote AJAX request. It also uses callbacks a lot and a pretty complicated "by convention" programming style (lots of event handlers are created on the fly based on certain object names, e.g. current page name, current step name). Adding to that, the code is very messy and there is no obvious inner structure: the functions are scattered in the code, file names do not reflect the business role of the code, lots of functions and code snippets are most likely not used at all, etc. PROBLEM: How do I approach this code base so that the inner flow of the code can be sort of "reverse engineered" using a suite of smart debugging tools? Ideally, I would like to be able to attach to the running application and step through the code, breaking on each new function call. Also, it would be nice to be able to create a "diagram of calls" in the application (i.e. in order to run a particular page's logic, this particular flow of function calls was executed in a particular order). Not to mention being able to run coverage analysis, identifying potentially orphaned code fragments. I would like to stress once more that it is impossible to understand the inner logic of the application just by looking at the code itself, unless you have LOTS of spare time and beer crates, which I unfortunately do not have :/ (shame...) An IDE of some sort that would aid in extending that code would also be great, but I am currently looking into the possibility of using Visual Studio 2010 to do the job, as the site itself is a mix of Classic ASP and ASP.NET (I'd say 70% JavaScript with jQuery, 30% ASP). I have obviously tried FireBug, but I was unable to find a way to define a breakpoint or step into the code which is "injected" into the client JS using AJAX calls (i.e. the application retrieves the code by invoking a URL and injects it into the client-side code). The Venkman debugger had similar issues. Any hints would be welcome. Feel free to ask additional questions.


  • Count number of queries executed by NHibernate in a unit test

    - by Bittercoder
    In some unit/integration tests we wish to check that our code makes correct use of the second-level cache. Based on the code presented by Ayende here: http://ayende.com/Blog/archive/2006/09/07/MeasuringNHibernatesQueriesPerPage.aspx I wrote a simple class for doing just that:

        public class QueryCounter : IDisposable
        {
            CountToContextItemsAppender _appender;

            public int QueryCount
            {
                get { return _appender.Count; }
            }

            public void Dispose()
            {
                var logger = (Logger) LogManager.GetLogger("NHibernate.SQL").Logger;
                logger.RemoveAppender(_appender);
            }

            public static QueryCounter Start()
            {
                var logger = (Logger) LogManager.GetLogger("NHibernate.SQL").Logger;
                lock (logger)
                {
                    foreach (IAppender existingAppender in logger.Appenders)
                    {
                        if (existingAppender is CountToContextItemsAppender)
                        {
                            var countAppender = (CountToContextItemsAppender) existingAppender;
                            countAppender.Reset();
                            return new QueryCounter { _appender = (CountToContextItemsAppender) existingAppender };
                        }
                    }

                    var newAppender = new CountToContextItemsAppender();
                    logger.AddAppender(newAppender);
                    logger.Level = Level.Debug;
                    logger.Additivity = false;
                    return new QueryCounter { _appender = newAppender };
                }
            }

            public class CountToContextItemsAppender : IAppender
            {
                int _count;

                public int Count
                {
                    get { return _count; }
                }

                public void Close() { }

                public void DoAppend(LoggingEvent loggingEvent)
                {
                    if (string.Empty.Equals(loggingEvent.MessageObject)) return;
                    _count++;
                }

                public string Name { get; set; }

                public void Reset() { _count = 0; }
            }
        }

    The intended usage is:

        using (var counter = QueryCounter.Start())
        {
            // ... do something
            Assert.Equal(1, counter.QueryCount); // check the query count matches our expectations
        }

    But it always returns 0 for QueryCount; no SQL statements are being logged. However, if I use NHibernate Profiler and invoke NHibernateProfiler.Initialize() in my test case (NHProf uses a similar approach of capturing NHibernate's logging output via log4net for analysis), then my QueryCounter starts working. It looks like I'm missing something in my code to get log4net configured correctly for logging NHibernate SQL. Does anyone have any pointers on what else I need to do to get SQL logging output from NHibernate?
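
    Presumably NHibernateProfiler.Initialize() only makes the counter work because it configures a log4net repository internally; with no configuration at all, the "NHibernate.SQL" logger never dispatches events to any appender, so the count stays at zero. Below is a minimal sketch of configuring log4net programmatically before the tests run. It assumes the test project has no <log4net> section in its app.config and that a plain BasicConfigurator setup is acceptable for test output.

        using log4net.Config;
        using log4net.Repository.Hierarchy;

        public static class TestLoggingSetup
        {
            static bool _configured;

            // Call once from the test fixture setup, before QueryCounter.Start().
            public static void EnsureConfigured()
            {
                if (_configured) return;

                var hierarchy = (Hierarchy) log4net.LogManager.GetRepository();
                if (!hierarchy.Configured)
                {
                    // Assumption: no existing log4net configuration; this wires up a
                    // default appender so the repository starts dispatching events.
                    BasicConfigurator.Configure();
                }

                _configured = true;
            }
        }

    If the test project does carry its own log4net configuration, calling log4net.Config.XmlConfigurator.Configure() in the fixture setup instead should have the same effect.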


  • XmlDataProvider and XPath bindings don't allow default namespace of XML data?

    - by Andy Dent
    I am struggling to work out how to use default namespaces with XmlDataProvider and XPath bindings. There's an ugly answer using local-name:

        <Binding XPath="*[local-name()='Name']" />

    but that is not acceptable to the client who wants this XAML to be highly maintainable. The fallback is to force them to use non-default namespaces in the report XML but that is an undesirable solution.

    The XML report file looks like the following. It will only work if I remove xmlns="http://www.acme.com/xml/schemas/report" so there is no default namespace.

        <?xml version="1.0" encoding="utf-8"?>
        <?xml-stylesheet type='text/xsl' href='PreviewReportImages.xsl'?>
        <Report xsl:schemaLocation="http://www.acme.com/xml/schemas/report BlahReport.xsd"
                xmlns:xsl="http://www.w3.org/2001/XMLSchema-instance"
                xmlns="http://www.acme.com/xml/schemas/report">
          <Service>Muncher</Service>
          <Analysis>
            <Date>27 Apr 2010</Date>
            <Time>0:09</Time>
            <Authoriser>Service Centre Manager</Authoriser>

    Which I am presenting in a window with XAML:

        <Window x:Class="AcmeTest.ReportPreview"
            xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
            xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
            Title="ReportPreview" Height="300" Width="300" >
            <Window.Resources>
                <XmlDataProvider x:Key="Data"/>
            </Window.Resources>
            <StackPanel Orientation="Vertical"
                        DataContext="{Binding Source={StaticResource Data}, XPath=Report}">
                <TextBlock Text="{Binding XPath=Service}"/>
            </StackPanel>
        </Window>

    with code-behind used to load an XmlDocument into the XmlDataProvider (seems the only way to have loading from a file or object varying at runtime):

        public partial class ReportPreview : Window
        {
            private void InitXmlProvider(XmlDocument doc)
            {
                XmlDataProvider xd = (XmlDataProvider)Resources["Data"];
                xd.Document = doc;
            }

            public ReportPreview(XmlDocument doc)
            {
                InitializeComponent();
                InitXmlProvider(doc);
            }

            public ReportPreview(String reportPath)
            {
                InitializeComponent();
                var doc = new XmlDocument();
                doc.Load(reportPath);
                InitXmlProvider(doc);
            }
        }
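
    One option worth trying, sketched below, is to give the default namespace a prefix on the provider itself via XmlDataProvider.XmlNamespaceManager, so the report XML stays untouched and only the bindings change. This is not from the question; it assumes a prefixed XPath still counts as maintainable for the client, and the "r" prefix is an arbitrary choice. The sketch is a modified version of InitXmlProvider.

        using System.Windows;
        using System.Windows.Data;
        using System.Xml;

        public partial class ReportPreview : Window
        {
            private void InitXmlProvider(XmlDocument doc)
            {
                var xd = (XmlDataProvider) Resources["Data"];

                // Map the arbitrary prefix "r" to the report's default namespace so
                // the XPath bindings can address elements as r:Report, r:Service, etc.
                var nsMgr = new XmlNamespaceManager(doc.NameTable);
                nsMgr.AddNamespace("r", "http://www.acme.com/xml/schemas/report");
                xd.XmlNamespaceManager = nsMgr;

                xd.Document = doc;
            }
        }

    The bindings would then read DataContext="{Binding Source={StaticResource Data}, XPath=r:Report}" and Text="{Binding XPath=r:Service}".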


  • .NET JIT Code Cache leaking?

    - by pitchfork
    We have a server component written in .NET 3.5. It runs as a service on Windows Server 2008 Standard Edition. It works great, but after some time (days) we notice massive slowdowns and an increased working set. We suspected some kind of memory leak and used WinDBG/SOS to analyze dumps of the process. Unfortunately the GC heap doesn't show any leak, but we noticed that the JIT code heap has grown from 8 MB after the start to more than 1 GB after a few days. We don't use any dynamic code generation techniques of our own. We use Linq2SQL, which is known for dynamic code generation, but we don't know whether it can cause such a problem. The main question is whether there is any technique to analyze the dump and check where all these Host Code Heap blocks shown in the WinDBG dumps come from.

    [Update] In the meantime we did some more analysis and identified Linq2SQL as a probable suspect, especially since we do not use precompiled queries. The following example program creates exactly the same behaviour, where more and more Host Code Heap blocks are created over time:

        using System;
        using System.Linq;
        using System.Threading;

        namespace LinqStressTest
        {
            class Program
            {
                static void Main(string[] args)
                {
                    for (int i = 0; i < 100; ++i)
                        ThreadPool.QueueUserWorkItem(Worker);

                    while (runs < 1000000)
                    {
                        Thread.Sleep(5000);
                    }
                }

                static void Worker(object state)
                {
                    for (int i = 0; i < 50; ++i)
                    {
                        using (var ctx = new DataClasses1DataContext())
                        {
                            long id = rnd.Next();
                            var x = ctx.AccountNucleusInfos.Where(an => an.Account.SimPlayers.First().Id == id).SingleOrDefault();
                        }
                    }

                    var localruns = Interlocked.Add(ref runs, 1);
                    System.Console.WriteLine("Action: " + localruns);
                    ThreadPool.QueueUserWorkItem(Worker);
                }

                static Random rnd = new Random();
                static long runs = 0;
            }
        }

    When we replace the Linq query with a precompiled one, the problem seems to disappear.
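
    For reference, a precompiled form of the query in the stress test might look like the sketch below. The singular entity name AccountNucleusInfo is assumed from the plural table property above; adjust it to whatever the designer actually generated. CompiledQuery.Compile builds the translated query and its delegate once, so Linq2SQL stops emitting a fresh dynamic method for every call.

        using System;
        using System.Data.Linq;
        using System.Linq;

        static class PrecompiledQueries
        {
            // Compiled once per process and reused by every worker thread.
            public static readonly Func<DataClasses1DataContext, long, AccountNucleusInfo> AccountInfoById =
                CompiledQuery.Compile((DataClasses1DataContext ctx, long id) =>
                    ctx.AccountNucleusInfos
                       .Where(an => an.Account.SimPlayers.First().Id == id)
                       .SingleOrDefault());
        }

    Inside Worker the lookup would then read var x = PrecompiledQueries.AccountInfoById(ctx, id); instead of building the query inline.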


  • R glm standard error estimate differences to SAS PROC GENMOD

    - by Michelle
    I am converting a SAS PROC GENMOD example into R, using glm. The SAS code was:

        proc genmod data=data0 namelen=30;
        model boxcoxy=boxcoxxy ~ AGEGRP4 + AGEGRP5 + AGEGRP6 + AGEGRP7 + AGEGRP8 + RACE1 + RACE3 + WEEKEND + SEQ/dist=normal;
        FREQ REPLICATE_VAR;
        run;

    My R code is:

        parmsg2 <- glm(boxcoxxy ~ AGEGRP4 + AGEGRP5 + AGEGRP6 + AGEGRP7 + AGEGRP8 +
                       RACE1 + RACE3 + WEEKEND + SEQ,
                       data = data0, family = gaussian, weights = REPLICATE_VAR)

    When I use summary(parmsg2) I get the same coefficient estimates as in SAS, but my standard errors are wildly different. The summary output from SAS is:

        Name       df  Estimate    StdErr     LowerWaldCL  UpperWaldCL  ChiSq      ProbChiSq
        Intercept  1   6.5007436   .00078884  6.4991975    6.5022897    67911982   0
        agegrp4    1   .64607262   .00105425  .64400633    .64813891    375556.79  0
        agegrp5    1   .4191395    .00089722  .41738099    .42089802    218233.76  0
        agegrp6    1   -.22518765  .00083118  -.22681672   -.22355857   73401.113  0
        agegrp7    1   -1.7445189  .00087569  -1.7462352   -1.7428026   3968762.2  0
        agegrp8    1   -2.2908855  .00109766  -2.2930369   -2.2887342   4355849.4  0
        race1      1   -.13454883  .00080672  -.13612997   -.13296769   27817.29   0
        race3      1   -.20607036  .00070966  -.20746127   -.20467944   84319.131  0
        weekend    1   .0327884    .00044731  .0319117     .03366511    5373.1931  0
        seq2       1   -.47509583  .00047337  -.47602363   -.47416804   1007291.3  0
        Scale      1   2.9328613   .00015586  2.9325559    2.9331668    -127

    The summary output from R is:

        Coefficients:
                     Estimate Std. Error t value Pr(>|t|)
        (Intercept)   6.50074    0.10354  62.785  < 2e-16
        AGEGRP4       0.64607    0.13838   4.669 3.07e-06
        AGEGRP5       0.41914    0.11776   3.559 0.000374
        AGEGRP6      -0.22519    0.10910  -2.064 0.039031
        AGEGRP7      -1.74452    0.11494 -15.178  < 2e-16
        AGEGRP8      -2.29089    0.14407 -15.901  < 2e-16
        RACE1        -0.13455    0.10589  -1.271 0.203865
        RACE3        -0.20607    0.09315  -2.212 0.026967
        WEEKEND       0.03279    0.05871   0.558 0.576535
        SEQ          -0.47510    0.06213  -7.646 2.25e-14

    The importance of the difference in the standard errors is that the SAS coefficients are all statistically significant, but the RACE1 and WEEKEND coefficients in the R output are not. I have found a formula to calculate the Wald confidence intervals in R, but this is pointless given the difference in the standard errors, as I will not get the same results. Apparently SAS uses a ridge-stabilized Newton-Raphson algorithm for its estimates, which are ML. The information I have read about the glm function in R is that its results should be equivalent to ML. What can I do to change my estimation procedure in R so that I get the equivalent coefficient and standard error estimates that were produced in SAS?

    Update: thanks to Spacedman's answer, I used weights because the data are from individuals in a dietary survey, and REPLICATE_VAR is a balanced repeated replication weight, which is an integer (and quite large, in the order of 1000s or 10000s). The website that describes the weight is here. I don't know why the FREQ rather than the WEIGHT command was used in SAS. I will now test by expanding the number of observations using REPLICATE_VAR and rerunning the analysis.

